3. Adding other CRUD operations

Description

In this article, the code from the previous sample (Adding simple criteria, sorting and data paging) will be refactored, correct filtering for the text columns will be implemented, and additional functionality will be added to produce a fully working DataSource implementation.

Moving the DataSource to an external file

In the previous sample, filtering by a text value would not select rows which contained the filter term, only those that were exactly equal to it. To filter text values correctly, it is necessary to identify which columns are of type text and which are of other types. To achieve this, the declaration of the DataSource needs to be moved to an external file, which will be loaded on both the client and the server side.

Firstly, create a new folder called 'ds' in the Solution Explorer, below App3 (as with samples 1 and 2, a new solution has been created for this sample). In this folder, create a JavaScript file which defines the DataSource itself. Give this file the same name as the ID of the DataSource, to provide an easy way of identifying which DataSource is associated with which table. Add to this file the definition of the DataSource (copied from the ui.js definition in sample 1):

isc.RestDataSource.create({
	ID:"suppyItem",
	fields:[
        {name:"itemID", type:"sequence", hidden:"true", primaryKey:"true"},
        {name:"itemName", type:"text", title:"Item", length:"128", required:"true"},
        {name:"SKU", type:"text", title:"SKU", length:"10", required:"true"},
        {name:"description", type:"text", title:"Description", length:"2000"},
        {name:"category", type:"text", title:"Category", length:"128", required:"true", foreignKey:"supplyCategory.categoryName"},
        {name:"units", type:"enum", title:"Units", length:"5",
            valueMap:["Roll", "Ea", "Pkt", "Set", "Tube", "Pad", "Ream", "Tin", "Bag", "Ctn", "Box"]
        },
        {name:"unitCost", type:"float", title:"Unit Cost", required:"true",
            validators:[
                {type:"floatRange", min:"0", errorMessage:"Please enter a valid (positive) cost"},
                {type:"floatPrecision", precision:2, errorMessage:"The maximum allowed precision is 2"}
            ]
        },

        {name:"inStock", type:"boolean", title:"In Stock"},
        {name:"nextShipment", type:"date", title:"Next Shipment"}
    ],

    dataFormat: "json",

    operationBindings:[
        {operationType: "fetch", dataProtocol: "postMessage", dataURL: "/RequestHandler/fetch"}
    ]
});

Now, modify the view file for the application to load this additional DataSource. The application view is located in the '/Views/Sample/Index.aspx' file. Edit this file and change its content to:

<%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/SmartClient.Master" Inherits="System.Web.Mvc.ViewPage<dynamic>" %>

<asp:Content ID="Content2" ContentPlaceHolderID="head" runat="server">

</asp:Content>

<asp:Content ID="Content1" ContentPlaceHolderID="MainContent" runat="server">
<script src="/ds/supplyItem.js">
</script>
<script src="/Scripts/ui.js">
</script>
</asp:Content>

Note the additional script tag that loads the DataSource definition which was moved to the newly created supplyItem.js file.

Making use of the DataSource definition file

At this stage, the definition of the DataSource (now in a separate file), looks like this:

isc.RestDataSource.create({
    ID: "supplyItem",
    fields: [
        { name: "itemID", type: "sequence", hidden: "true", primaryKey: "true" },
        { name: "itemName", type: "text", title: "Item", length: "128", required: "true" },
        { name: "SKU", type: "text", title: "SKU", length: "10", required: "true" },
        { name: "description", type: "text", title: "Description", length: "2000" },
        { name: "category", type: "text", title: "Category", length: "128", required: "true", foreignKey: "supplyCategory.categoryName" },
        { name: "units", type: "enum", title: "Units", length: "5",
            valueMap: ["Roll", "Ea", "Pkt", "Set", "Tube", "Pad", "Ream", "Tin", "Bag", "Ctn", "Box"]
        },
        { name: "unitCost", type: "float", title: "Unit Cost", required: "true",
            validators: [
                { type: "floatRange", min: "0", errorMessage: "Please enter a valid (positive) cost" },
                { type: "floatPrecision", precision: 2, errorMessage: "The maximum allowed precision is 2" }
            ]
        },

        { name: "inStock", type: "boolean", title: "In Stock" },
        { name: "nextShipment", type: "date", title: "Next Shipment" }
    ],

    dataFormat: "json",

    operationBindings: [
        {operationType: "fetch", dataProtocol: "postMessage", dataURL: "/RequestHandler/fetch"}
    ]
});

Notice that this definition is almost a JSON file, with a small addendum at the beginning and the end. If the 'isc.RestDataSource.create(' part at the beginning and the ');' part at the end were removed, what remains would be a valid JSON object.
To deserialize it, an object is needed to deserialize it into. At the moment, only the fields are of interest, so the object would look like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;

namespace App3.Utils
{
    public class DataSource
    {
        public string ID { get; set; }
        public string dataFormat { get; set; }
        public ICollection<Dictionary<string, object>> fields { get; set; }
        public ICollection<Dictionary<string, object>> operationBindings { get; set; }

        /// <summary>
        /// Get a field's data by its name
        /// </summary>
        /// <param name="name">The name of the field</param>
        /// <returns>A Dictionary with the attributes of the field, or null if the field was not found</returns>
        public Dictionary<string, object> getField(string name)
        {
            foreach (Dictionary<string, object> field in fields)
            {
                if (field.ContainsKey("name") && field["name"].Equals(name))
                {
                    return field;
                }
            }

            return null;
        }
    }
}

The fields themselves are loaded into a collection of dictionaries. Additionally, a method has been added to get a field by its name.
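
For illustration, a minimal sketch of how this class might be used once the definition text has been loaded (dsStr is assumed to already hold the JSON text of the definition, with the isc.RestDataSource.create wrapper stripped off):

// deserialize the definition and look up the "itemName" field
DataSource ds = JsonConvert.DeserializeObject<DataSource>(dsStr);
Dictionary<string, object> field = ds.getField("itemName");

// read the declared type of the field
if (field != null && field.ContainsKey("type"))
{
    string fieldType = field["type"] as string;   // "text" for this field
}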

Refactoring DSRequest

As the logic in the RequestHandlerController is starting to get more complicated, it is sensible to refactor this code into a format which is easier to follow and to extend with the new functionality. Therefore, all the request and response processing will be moved to a separate, specialized class called RPCManager, and the processing flow will change accordingly. The controller itself will only call the RPCManager and delegate all responsibility to it. The RPCManager will parse the payload, set up the DSRequest and call the request's execute() method, which will return a DSResponse object. The RPCManager will then convert this DSResponse into a suitable response and return it to the controller. As you can see, the actual work is being moved from the controller to the RPCManager, which delegates it to the DSRequest class. DSRequest in turn delegates the calls to the DataSource itself (explained later in this article).

So, firstly, refactor the existing classes to be more generic and to fit this new flow. As it is likely the data member of the DSRequest object will be required in various formats, it should be made generic. This way, when deserialization happens, it will (where possible) be in the correct format for the requirement, without the hassle of having to convert the values manually from string to value and back again. Also, the oldValues member will be parameterized, and the functionality to delegate the processing to the DataSource will be added. The DSRequest class should now look like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;

namespace App3.Utils
{
    public class DSRequest<T>
    {
        public string dataSource { get; set; }
        public string operationType { get; set; }
        public int startRow { get; set; }
        public int endRow { get; set; }
        public string textMatchStyle { get; set; }
        public string componentId { get; set; }

        public T data { get; set; }
        public string[] sortBy { get; set; }

        public T oldValues { get; set; }

        // the RPC Manager which is executing this DSRequest
        public RPCManager RpcManager { get; set; }

        virtual public DSResponse execute()
        {
            if (RpcManager == null)
            {
                return null;
            }

            if (dataSource == null)
            {
                return null;
            }

            DataSource ds = RpcManager.getDataSource(dataSource);

            if (ds == null)
            {
                return null;
            }

            return ds.execute<T>(this);
        }
    }
}

A couple of noteworthy comments regarding the newly refactored code:

  • A reference to the RPCManager executing this request will be stored in DSRequest. This has to be done because, while the request is being executed, access will be required to various items such as the NHibernate session, the DataSource object, etc. These items will all be provided by the RPCManager class.
  • The execute() method itself only loads the DataSource object then calls the DataSource's execute method for processing the request.

Refactoring DSResponse

In the DSResponse object, notice that it contains an object wrapped inside another just to mimic the structure of the JSON required by the front-end, with all the properties forwarded to the internal object. In order to make the DSResponse a bit simpler, remove this inner class and property, thereby making the DSResponse a simple container for properties. However, when serializing it out to JSON, anonymous objects will be needed to produce the correct JSON format:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;

namespace App3.Utils
{
    public class DSResponse
    {
        public int status { get; set; }
        public int startRow { get; set; }
        public int endRow { get; set; }
        public int totalRows { get; set; }
        public object data { get; set; }
    }
}

The RPCManager

Now implement the RPCManager class. The RPCManager class will not be directly instantiated; instead, a static method will be called from within the controller method to process the request. This static method will instantiate the RPCManager for the request and execute the processing with it:

public static ActionResult processRequest(HttpRequestBase request, HttpResponseBase response)
{
    RPCManager manager = new RPCManager(request, response);

    return manager.processARequest();
}

This will return an ActionResult, which can be returned directly by the controller. Processing this request can then be implemented:

private ActionResult processARequest()
{
    // retrieve the request with data in the form of a Dictionary<string, object>
    DSRequest<Dictionary<string, object>> request = parseRequest<Dictionary<string, object>>();

    DSResponse res = null;

    // create a session
    using (ISession session = NHibernateHelper.OpenSession())
    {
        // store the session in the RPCManager, so requests can use it
        Session = session;

        // set the RPCManager executing this request
        request.RpcManager = this;

        //execute the request and get the response
        res = request.execute();

        // safeguard, if was null, create an empty response with failed status
        if (res == null)
        {
            res = new DSResponse();
            res.status = -1;
        }
    }

    return buildResult(res);
}

This method makes use of various helper methods: one for parsing a DSRequest object from the payload, and one for building the response to be sent to the front-end. As discussed earlier in the article, the response is created as an anonymous object to be serialized to JSON (an object which mimics the layout of the required JSON). The object is created with the appropriate values, retrieved from the DSResponse object which the DSRequest's execute() method returned:

protected JsonNetResult buildResult(DSResponse response)
{
    var data = new
    {
        response = new
        {
            data = response.data,
            status = response.status,
            startRow = response.startRow,
            endRow = response.endRow,
            totalRows = response.totalRows
        }
    };

    // convert it to JSON
    JsonNetResult jsonNetResult = new JsonNetResult();
    jsonNetResult.Formatting = Formatting.Indented;
    jsonNetResult.SerializerSettings.Converters.Add(new IsoDateTimeConverter());
    jsonNetResult.SerializerSettings.NullValueHandling = NullValueHandling.Ignore;

    jsonNetResult.Data = data;

    return jsonNetResult;
}
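
With the settings above, null properties are omitted and the output mirrors the nested 'response' structure which the RestDataSource expects. The values below are purely illustrative:

{
    "response": {
        "data": [
            { "itemID": 1, "itemName": "Copy paper", "SKU": "34599", "unitCost": 5.0 }
        ],
        "status": 0,
        "startRow": 0,
        "endRow": 1,
        "totalRows": 152
    }
}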

Parsing the payload is not complicated either, as the payload has already been stored in the requestBody property. The data member of DSRequest will be converted to its specified type by the JSON serializer itself:

protected DSRequest<T> parseRequest<T>()
{
    DSRequest<T> req = JsonConvert.DeserializeObject<DSRequest<T>>(requestBody);

    return req;
}
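
The requestBody property is simply the raw JSON payload of the HTTP request. A minimal sketch of how the RPCManager constructor could capture it is shown below (the member names httpRequest, httpResponse and requestBody are assumed here; the full implementation is in the attached archive):

public RPCManager(HttpRequestBase request, HttpResponseBase response)
{
    httpRequest = request;
    httpResponse = response;

    // read the raw JSON payload of the request so it can be deserialized later
    request.InputStream.Position = 0;
    using (StreamReader reader = new StreamReader(request.InputStream))
    {
        requestBody = reader.ReadToEnd();
    }
}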

In addition to these helper methods, the RPCManager class contains further methods and properties which are intended to be used from inside DSRequest and DataSource. One converts the DSRequest from its Dictionary<string, object> mapped data version to one with a specific generic type:

public DSRequest<T> convertRequest<T>(DSRequest<Dictionary<string, object>> request)
{
    return parseRequest<T>();
}

and the other loads and returns a DataSource specified by its ID:

public DataSource getDataSource(string datasourceId)
{
    // Load the DataSource. The DataSources are stored in the /ds/ folder in a file
    // with same name as the id of the DataSource
    try
    {
        StreamReader sr = new StreamReader(httpRequest.MapPath("~/ds/" + datasourceId + ".js"));
        var dsStr = sr.ReadToEnd();
        sr.Close();

        dsStr = dsStr.Replace("isc.RestDataSource.create(", "");
        dsStr = dsStr.Replace(");", "");

        DataSource ds = JsonConvert.DeserializeObject<DataSource>(dsStr);

        return ds;
    }
    catch (IOException)
    {
        // if not found, return null
        return null;
    }
}

The DSRequest and DataSource methods can also access the Session property of the RPCManager, which will be set to the current NHibernate session for the duration of the execution of the DSRequest.

For more details and the full source code of the RPCManager please review the attached archive at the end of this article.

Refactoring the controller

Code in the controller needs to be refactored to make use of the RPCManager class. With all the functionality delegated to the RPCManager, the code in the controller becomes extremely simple:

using System;
using System.Linq;
using System.Web;
using System.Web.Mvc;

using App3.Utils;

namespace App3.Controllers
{
    public class RequestHandlerController : Controller
    {
        public ActionResult fetch()
        {
            return RPCManager.processRequest(Request, Response);
        }
    }
}

Note: the code that has been removed is no longer required here, as the functionality for processing the requests has been moved inside the DataSource class. As discussed earlier, the functionality previously in the controller is now resident in the DataSource class itself:

virtual public DSResponse execute<T>(DSRequest<T> request)
{
    if (request == null)
    {
        return null;
    }

    if ("fetch".Equals(request.operationType))
    {
        return executeFetch<T>(request);
    }

    return null;
}

virtual public DSResponse executeFetch<T>(DSRequest<T> request)
{
    DSRequest<Dictionary<string, object>> req = request as DSRequest<Dictionary<string, object>>;
    DSRequest<supplyItem> itmreq = request.RpcManager.convertRequest<supplyItem>(req);

    DataSource ds = request.RpcManager.getDataSource(req.dataSource);

    var query = request.RpcManager.Session.CreateCriteria<supplyItem>();

    // build the criteria
    if (req.data.Keys.Count != 0)
    {
        foreach (string key in req.data.Keys)
        {
            Dictionary<string, object> field = ds.getField(key);

            // make sure the field is in the DataSource
            if (field != null)
            {
                // get the property value
                PropertyInfo pi = itmreq.data.GetType().GetProperty(key);
                object val = pi.GetValue(itmreq.data, null);

                string type = field["type"] as string;

                if (type.Equals("text") || type.Equals("link") || type.Equals("enum") ||
                    type.Equals("image") || type.Equals("ntext"))
                {
                    query.Add(Restrictions.Like(key, "%" + val + "%"));
                    break;
                }
                else
                {
                    query.Add(Restrictions.Eq(key, val));
                    break;
                }
            }
        }
    }

    // add sorting
    if (req.sortBy != null)
    {
        // add the sorting
        foreach (string column in req.sortBy)
        {
            // if the column name starts with '-', the ordering is descending, otherwise ascending
            if (column.StartsWith("-"))
            {
                // for descending sorts the '-' has to be removed from the beginning to get the correct
                // column name
                query.AddOrder(Order.Desc(column.Substring(1)));
            }
            else
            {
                query.AddOrder(Order.Asc(column));
            }
        }
    }

    // create a response object
    DSResponse dsresponse = new DSResponse();

    // set start row and number of rows on the query itself
    if (req.endRow != 0)
    {
        query.SetMaxResults(req.endRow - req.startRow);
        query.SetFirstResult(req.startRow);
    }

    // get the requested number of objects
    var products = query.List<supplyItem>();

    // change projection to get the total number of rows
    // we need to clear the result range and the ordering for this to work
    query.SetProjection(Projections.RowCount());
    query.SetFirstResult(0);
    query.SetMaxResults(int.MaxValue);
    query.ClearOrders();

    // set total rows using the projection
    dsresponse.totalRows = (int)query.UniqueResult<int>();

    // set the response data
    dsresponse.data = products;
    dsresponse.startRow = req.startRow;
    dsresponse.endRow = dsresponse.startRow + products.Count();

    // sanity check, if no rows, return 0
    if (dsresponse.endRow < 0)
    {
        dsresponse.endRow = 0;
    }

    dsresponse.status = 0;

    return dsresponse;
}
  • At the beginning of the executeFetch() method an additional DSRequest object is created, one whose data property is a supplyItem (the NHibernate entity object), unlike the DSRequest object the method was called with, whose data property is a Dictionary<string, object>. This needs to be done because the request is deserialized in the RPCManager as a DSRequest<Dictionary<string, object>> object, and by creating a DSRequest<supplyItem> the deserializer takes care of converting all the required fields to the right types as they are defined in the supplyItem entity. Both representations are needed: deserializing into supplyItem alone does not reveal which fields were specified as filter parameters, since properties which are not present in the payload are simply null or at their default values, making it impossible to distinguish filtered fields from mere defaults. This approach prevents having to manually convert all of the required properties to the right type when creating the Criteria for filtering (see the illustrative payload after this list).
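
To make the distinction concrete, a hypothetical fetch payload in which the user has filtered on two columns might look like this (all values are purely illustrative):

{
    "dataSource": "supplyItem",
    "operationType": "fetch",
    "startRow": 0,
    "endRow": 20,
    "textMatchStyle": "substring",
    "data": { "itemName": "tape", "unitCost": 1.5 }
}

Deserialized as DSRequest<Dictionary<string, object>>, data.Keys lists exactly the filtered fields ("itemName" and "unitCost"). Deserialized as DSRequest<supplyItem>, data.unitCost is already a typed float ready to be used in the Criteria, but every unfiltered property is simply null or at its default value, so the set of filtered fields cannot be recovered from it.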

Another technique used to avoid manual conversion is reflection, which is used to get and set the property values. Looking at the above code, you will notice the following:

foreach (string key in req.data.Keys)
{
    Dictionary<string, object> field = ds.getField(key);

    // make sure the field is in the DataSource
    if (field != null)
    {
        // get the property value
        PropertyInfo pi = itmreq.data.GetType().GetProperty(key);
        object val = pi.GetValue(itmreq.data, null);

        string type = field["type"] as string;

        if (type.Equals("text") || type.Equals("link") || type.Equals("enum") ||
            type.Equals("image") || type.Equals("ntext"))
        {
            query.Add(Restrictions.Like(key, "%" + val + "%"));
            break;
        }
        else
        {
            query.Add(Restrictions.Eq(key, val));
            break;
        }
    }
}

Here a PropertyInfo object is created which retrieves the value of the property as the right type, without actually knowing its type or hard-coding its name. This value can then be set in the criteria. Also note that if the field type is a string-like type, the Like restriction is used to match the filter term as a substring; otherwise Eq is used. This means that for string fields, substring matching will be performed instead of the equality matching shown in the previous article.

The rest of the code is straightforward, with the exception of retrieving the count of records. As Criteria is now being used, it is possible to use a Projection to do this, which was not possible in the previous sample:

// get the requested number of objects
var products = query.List<supplyItem>();

// change projection to get the total number of rows
// we need to clear the result range and the ordering for this to work
query.SetProjection(Projections.RowCount());
query.SetFirstResult(0);
query.SetMaxResults(int.MaxValue);
query.ClearOrders();

// set total rows using the projection
dsresponse.totalRows = (int)query.UniqueResult<int>();

However, there are a couple of items that must be catered for: firstly, the pagination settings (first result and the maximum number of rows returned) have to be reset, otherwise the count will be wrong (0); secondly, any ordering information must be removed, otherwise the query will fail with an SQL error.

Adding additional CRUD operations

At this point, this is functionally the same sample as the previous example (but using Criteria this time, and making use of the DataSource definition on the server side). To build on this, add the missing operations to make this a fully functioning DataSource.

Client side changes

Firstly, make the grid editable. Add the canEdit:true property to the ListGrid. Then, to allow for removing rows, add the canRemoveRecords:true property. The ListGrid definition will now look like this:

isc.ListGrid.create({
    ID: "supplyItemGrid",
    width: 700, height: 224, alternateRecordStyles: true,
    dataSource: supplyItem,
    showFilterEditor: true,
    autoFetchData:true,
    dataPageSize:20,
    canEdit:true,
    canRemoveRecords:true
});

Next, there needs to be a method for adding new records. Add a button which, on click, will make a new record available for editing in the grid. The source for this is:

isc.IButton.create({
    top: 250,
    title: "Edit New",
    click: "supplyItemGrid.startEditingNew()"
});

The DataSource definition needs to be updated to properly manage the requests for the add, update and remove operations.

After these changes, the DataSource definition should look like this:

isc.RestDataSource.create({
    ID: "supplyItem",
    fields: [
		{ name: "itemID", type: "sequence",  primaryKey: "true" },
        { name: "itemName", type: "text", title: "Item", length: "128", required: "true" },
        { name: "SKU", type: "text", title: "SKU", length: "10", required: "true" },
        { name: "description", type: "text", title: "Description", length: "2000" },
        { name: "category", type: "text", title: "Category", length: "128", required: "true", foreignKey: "supplyCategory.categoryName" },
        { name: "units", type: "enum", title: "Units", length: "5",
            valueMap: ["Roll", "Ea", "Pkt", "Set", "Tube", "Pad", "Ream", "Tin", "Bag", "Ctn", "Box"]
        },
        { name: "unitCost", type: "float", title: "Unit Cost", required: "true",
            validators: [
                { type: "floatRange", min: "0", errorMessage: "Please enter a valid (positive) cost" },
                { type: "floatPrecision", precision: 2, errorMessage: "The maximum allowed precision is 2" }
            ]
        },

        { name: "inStock", type: "boolean", title: "In Stock" },
        { name: "nextShipment", type: "date", title: "Next Shipment" }
    ],

    dataFormat: "json",

    operationBindings: [
        { operationType: "fetch", dataProtocol: "postMessage", dataURL: "/RequestHandler/fetch" },
        { operationType: "add", dataProtocol: "postMessage", dataURL: "/RequestHandler/add" },
        { operationType: "update", dataProtocol: "postMessage", dataURL: "/RequestHandler/update" },
        { operationType: "remove", dataProtocol: "postMessage", dataURL: "/RequestHandler/remove" }
    ]
});

The add, update and remove methods will be mapped to the RequestHandler controller's methods with the same names.

Server side code

For the server side, the existing code needs to be enhanced using the same principles used for the fetch operation. All of the functionality will be placed inside the DataSource class, keeping the controller as simple as possible: it just calls the RPCManager to execute the request and returns the response back to the front-end.

Remove operation

For the remove() operation, the data element of the JSON payload only contains the primary key of the object to be removed. The matching code in the DataSource is as follows:
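
For reference, a hypothetical remove payload might look like this (the itemID value is purely illustrative):

{
    "dataSource": "supplyItem",
    "operationType": "remove",
    "data": { "itemID": 42 }
}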

virtual public DSResponse executeRemove<T>(DSRequest<T> request)
{
    DSRequest<supplyItem> req = request.RpcManager.convertRequest<supplyItem>(request as DSRequest<Dictionary<string, object>>);

    supplyItem itm = request.RpcManager.Session.Load<supplyItem>(req.data.itemID);

    request.RpcManager.Session.Delete(itm);
    request.RpcManager.Session.Flush();

    DSResponse dsresponse = new DSResponse();

    dsresponse.data = req.data;
    dsresponse.status = 0;

    return dsresponse;
}

Convert the request into a DSRequest<supplyItem> object, use the primary key to retrieve the entity, then remove it using NHibernate. As a response, send back the primary key value of the removed row, so the front-end is aware of which row has been affected.

Add operation

For Add, the data element of the payload will contain the entire record that needs to be created. Simply convert the request into the DSRequest<supplyItem> object and create it straight away:

virtual public DSResponse executeAdd<T>(DSRequest<T> request)
{
    DSRequest<supplyItem> req = request.RpcManager.convertRequest<supplyItem>(request as DSRequest<Dictionary<string, object>>);

    DataSource ds = request.RpcManager.getDataSource(req.dataSource);

    request.RpcManager.Session.Save(req.data);
    request.RpcManager.Session.Flush();

    DSResponse dsresponse = new DSResponse();

    dsresponse.data = req.data;
    dsresponse.status = 0;

    return dsresponse;
}

As the response data, send back the entity that was just created, which is the same object as the request data (i.e. send back the complete request data).

Update operation

For update, the data element will contain the fields that have been changed by the update, and oldValues will contain the complete old record (before any changes were made). As with fetch, the update request will be deserialized into two request objects: one DSRequest<supplyItem>, to have access to the typed properties, and one DSRequest<Dictionary<string, object>>, to have access to the items which are being updated. Reflection is then used to set the properties (without knowing their type or hard-coding their name) on the entity object which is being updated. This object is then returned as the response payload to the front-end:
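
For illustration, a hypothetical update payload in which only the unitCost column was edited might look like this (all values are purely illustrative):

{
    "dataSource": "supplyItem",
    "operationType": "update",
    "data": { "itemID": 42, "unitCost": 3.75 },
    "oldValues": { "itemID": 42, "itemName": "Packing tape", "SKU": "34599", "category": "General Supplies", "units": "Roll", "unitCost": 3.5, "inStock": true }
}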

virtual public DSResponse executeUpdate<T>(DSRequest<T> request)
{
    DSRequest<Dictionary<string, object>> reqmap = request as DSRequest<Dictionary<string, object>>;
    DSRequest<supplyItem> req = request.RpcManager.convertRequest<supplyItem>(reqmap);

    DataSource ds = request.RpcManager.getDataSource(req.dataSource);

    supplyItem itm = request.RpcManager.Session.Load<supplyItem>(req.data.itemID);

    // update all fields which have changed. they are defined in the data property
    // build the criteria
    if (reqmap.data.Keys.Count != 0)
    {
        foreach (string key in reqmap.data.Keys)
        {
            PropertyInfo pi = req.data.GetType().GetProperty(key);

            // get the property value
            object val = pi.GetValue(req.data, null);

            // set the property value
            pi.SetValue(itm, val, null);
        }
    }

    request.RpcManager.Session.Update(itm);
    request.RpcManager.Session.Flush();

    // create a result object to be returned
    // and copy all properties of the updated object into this one
    supplyItem data = new supplyItem();

    if (reqmap.oldValues.Keys.Count != 0)
    {
        foreach (string key in reqmap.oldValues.Keys)
        {
            PropertyInfo pi = req.data.GetType().GetProperty(key);

            // get the property value of the updated entity
            object val = pi.GetValue(itm, null);

            // store it into the new object
            pi.SetValue(data, val, null);
        }
    }

    // create the DSResponse object
    DSResponse dsresponse = new DSResponse();
    dsresponse.data = data;
    dsresponse.status = 0;

    return dsresponse;
}

Note in the above code snippet that, instead of returning the entity updated with NHibernate, a new entity is created and the updated entity's data is copied into it. This is done because otherwise errors can occur when the entity is serialized, due to NHibernate's lazy loading. If lazy loading is disabled, then instead of creating a new entity object and populating its properties, the updated entity can simply be returned.

To complete the fully working sample, the execute() method in the DataSource needs to handle these additional operations:

virtual public DSResponse execute<T>(DSRequest<T> request)
{
    if (request == null)
    {
        return null;
    }

    if ("add".Equals(request.operationType))
    {
        return executeAdd<T>(request);
    }

    if ("fetch".Equals(request.operationType))
    {
        return executeFetch<T>(request);
    }

    if ("update".Equals(request.operationType))
    {
        return executeUpdate<T>(request);
    }

    if ("remove".Equals(request.operationType))
    {
        return executeRemove<T>(request);
    }

    return null;
}

And finally, the RequestHandlerController needs to have the functionality defined to call the RPCManager for these newly introduced operations:

using System;
using System.Linq;
using System.Web;
using System.Web.Mvc;

using App3.Utils;

namespace App3.Controllers
{
    public class RequestHandlerController : Controller
    {
        public ActionResult fetch()
        {
            return RPCManager.processRequest(Request, Response);
        }

        public ActionResult remove()
        {
            return RPCManager.processRequest(Request, Response);
        }

        public ActionResult add()
        {
            return RPCManager.processRequest(Request, Response);
        }

        public ActionResult update()
        {
            return RPCManager.processRequest(Request, Response);
        }
    }
}

At this point, all four operations are implemented (fetch, add, update, remove). However, please note these are just plain samples and do not contain anything beyond basic type and error validation.

A Visual Studio solution with the complete source code can be downloaded from here.