Creating a BCS .NET Connector using Visual Studio


 

  1. Launch Visual Studio 2010 and create a new Empty SharePoint Project. The project should be deployed as a farm solution.
  2. Add a new Business Data Connectivity Model item to the project and name it NorthwindModel.
  3. Since this model is going to connect to the Northwind SQL Server database, we will use LINQ to SQL to establish the connectivity. Right-click the NorthwindModel item in the project and add a new item. Click the Visual C# node and select the LINQ to SQL Classes template. Name the file NorthwindEntityClasses.
  4. Click the Server Explorer link in the dbml designer. In the Server Explorer window that appears, right-click the Data Connections node and click Add Connection.
  5. In the Choose Data Source dialog, select SQL Server as the data source and click Continue.
  6. Specify the server where the Northwind database is available and select the Northwind database from the dropdown.
  7. Expand the list of tables, select the Products table, and drop it on the designer surface.
  8. Delete all columns from the table except ProductID, ProductName and UnitPrice. This gives us a class named Product with just three properties.
  9. Since we are going to work with the Product class generated by the LINQ to SQL designer and not the Entity1 class generated automatically by Visual Studio, delete the Entity1.cs file from the project.
  10. Open the Entity1Service.cs file. This class is the .NET connector that the .bdcm (BDC model) file will connect to. Delete all the methods currently present in the class.
  11. Add the following code to the class. It contains two methods, ReadItem and ReadList. ReadItem takes a ProductID as an input parameter and returns one Product object; ReadList returns all the products present in the database. (Note: verify the connection string and make sure it is valid for your environment.)

public static Product ReadItem(int ProductId)
{
    // Note: verify the connection string for your environment.
    NorthwindEntityClassesDataContext northwind = new NorthwindEntityClassesDataContext(
        "server=localhost;initial catalog=Northwind;integrated security=true");

    var query = from p in northwind.Products
                where p.ProductID == ProductId
                select p;

    return query.First();
}

public static IEnumerable<Product> ReadList()
{
    NorthwindEntityClassesDataContext northwind = new NorthwindEntityClassesDataContext(
        "server=localhost;initial catalog=Northwind;integrated security=true");

    var query = from p in northwind.Products
                select p;

    return query.ToList();
}

 

  12. Save and close the Entity1Service.cs file and open the NorthwindModel.bdcm file.
  13. In the BDC Explorer window, change the value of the Name property of Entity1 to Product.
  14. Since the identifying attribute of our Product entity is ProductID and its datatype is an integer, make the necessary changes as shown below.
  15. In the BDC Explorer window, expand the ReadItem method and click the id parameter. Change its name to ProductID.
  16. Since ProductID is an integer, expand the type descriptor of ProductID and change its Type Name property to System.Int32.
  17. Expand the returnParameter and rename the Entity1 node to Product. Change the value of its Type Name property to point to the Product class instead of Entity1.
  18. Change the name and type of the first type descriptor to ProductID and System.Int32.
  19. Change the name and type of the second type descriptor to ProductName and System.String.
  20. Add a third type descriptor and specify UnitPrice as its Name and System.Decimal as its Type.
  21. Now expand the ReadList method. This method has no input parameters, just a return parameter. Click Entity1List and change its Type Name property to point to the Product class instead of Entity1. Also ensure that the IsEnumerable checkbox is checked. You can optionally rename Entity1List to ProductList.
  22. Delete the Entity1 node under the current node.
  23. Copy the Product node from the ReadItem method under the Entity1List node.
  24. Deploy the project.

Configuring and using the External Content Type

 

  1. Launch SharePoint Central Administration. In the Application Management group, click Manage Service Applications.
  2. Click the Business Data Connectivity service application. You should see the external content type you just deployed in this list.
  3. If necessary, create a profile page for this content type.
  4. If necessary, set permissions on this content type.
  5. Go to any valid SharePoint site and create a new External List. Link your external list with the external content type you deployed in the previous steps (Product).
  6. The list should execute your ReadList method and display the list of products.

 

Setting up Search Infrastructure – Part III


Some of the content appearing in these posts is taken from the SharePoint 2010 Search Evaluation Guide which can be downloaded from here.

 

This post covers the following –

  • Creating Metadata Properties
  • Search Reports
  • Creating Keywords, Definitions and Best Bets
  • Creating Search Scopes

 

Part I of this series is available here

Part II of this series is available here

 

Creating Metadata Properties

Crawled properties represent the metadata for content that is indexed. Typically, crawled properties include column data for SharePoint list items, document properties for Microsoft Office or other binary file types, and HTML metadata in Web pages. Administrators map crawled properties to managed properties in order to provide useful search experiences. For example, an administrator might create a managed property named Client that maps to crawled properties called Customer and Client from different content repositories. Managed properties can then be used across enterprise search solutions, such as in defining search scopes and in applying query filters.

In this procedure you will create a custom column. You will then crawl the lists so that their columns are indexed, and then you will create a managed metadata property that maps to columns in the lists.

  1. Browse to your SharePoint site. Navigate to any existing list and create a new column called Technology in it. Edit the properties of some of the list items so that they contain a value in the newly added column.
  2. Go to the Search Service Application and start a full crawl on the Local SharePoint Sites content source. You need not wait until the crawl completes.
  3. Go to the Search Center and run a search query of the form <column name>:<value> (for example, Technology:CRM). Notice that no search results are returned. This is because either the content source has not yet been crawled, or the crawled property has not yet been mapped to a managed property.
  4. On the Quick Launch of the Search Service Application, in the Queries and Results section, click Metadata Properties | New Managed Property.
  5. In the Property Name text box, type Technology.
  6. Click Add Mapping.
  7. In the Select a category drop-down list, ensure that All categories is selected. In the Crawled property name box, type ows_Technology, and then click Find.
  8. Click the ows_Technology(Text) property in the search results, and then click OK.
  9. Check the Allow this property to be used in scopes check box. Click OK.
  10. Start a full crawl of the Local SharePoint Sites content source and wait until the crawl completes. It should take about 2-3 minutes.
  11. Navigate to the Search Center and re-run the search query. This time you should see matching items in the search results.
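The same crawled-to-managed property mapping can also be scripted. The following PowerShell sketch (not part of the original walkthrough) uses the SharePoint 2010 search administration cmdlets; it assumes a single Search service application and the ows_Technology crawled property produced by the crawl described above:

```powershell
# Sketch only: run from the SharePoint 2010 Management Shell.
# Assumes exactly one Search service application in the farm.
$ssa = Get-SPEnterpriseSearchServiceApplication
$cat = Get-SPEnterpriseSearchMetadataCategory -SearchApplication $ssa -Identity "SharePoint"
$cp  = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name "ows_Technology" -Category $cat
$mp  = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Name "Technology" -Type 1
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $mp -CrawledProperty $cp
```

A full crawl is still required after the mapping before queries against the new property return results.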

 

 

Search Reports

The following step-by-step instructions will help you get started working with search reports.

Running Administration Reports

  1. On the Quick Launch of the Search Service Application, in the Reports section, click Administration Reports.
  2. Click Search administration reports.
  3. Click each of the reports to review the information contained.
  4. On the Quick Launch, in the Reports section, click Web Analytics Reports.
  5. Click each of the links on the Quick Launch to view the different reports.

 

Creating Keywords, Definitions, Best Bets, and Synonyms

Best Bets are URLs to documents that are associated with one or more keywords. Typically these documents or sites are ones that you expect users will want to see at the top of the search results list. Best Bets are returned by queries that include the associated keywords, regardless of whether the URL has been indexed. Site collection administrators can create keywords and associate Best Bets with them.

Synonyms are words that mean the same thing as other words. For example, you might consider laptop and notebook to mean the same thing. Administrators can create synonyms for keywords that information workers are likely to search for in their organization. Additionally, synonyms that can be used to improve recall of relevant documents are stored in thesaurus files.
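As an illustration of the thesaurus files mentioned above, an expansion set uses the standard Microsoft Search thesaurus XML format. The sketch below shows the laptop/notebook example; the exact file name and location (for example tsenu.xml for English on the query servers) depend on your environment:

```xml
<XML ID="Microsoft Search Thesaurus">
  <thesaurus xmlns="x-schema:tsSchema.xml">
    <expansion>
      <sub>laptop</sub>
      <sub>notebook</sub>
    </expansion>
  </thesaurus>
</XML>
```

Each <sub> term inside an <expansion> block is treated as equivalent at query time, so a query for laptop also matches documents containing notebook.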

  1. Browse to the Search Center site. On the Site Actions menu, click Site Settings.
  2. In the Site Collection Administration section, click Search keywords.
  3. Click Add Keyword.
  4. In the Keyword Phrase text box, type SharePoint.
  5. In the Synonyms text box, type SharePoint Foundation; SharePoint Server; Windows SharePoint Services.
  6. Click Add Best Bet.
  7. In the URL text box, type http://www.microsoft.com/sharepoint.
  8. In the Title text box, type SharePoint on the Web.
  9. In the Description text box, type SharePoint home page on http://www.microsoft.com.
  10. Click OK.
  11. Click Add Best Bet.
  12. In the URL text box, type http://msdn.microsoft.com/sharepoint.
  13. In the Title text box, type SharePoint Developer.
  14. In the Description text box, type SharePoint home page on MSDN.
  15. Click OK.
  16. In the Keyword Definition text box, type Collaboration and Search Platform.
  17. Click OK.

 

Creating Search Scopes

  1. Browse to the Search Center site. On the Site Actions menu, click Site Settings.
  2. In the Site Collection Administration section, click Search scopes.
  3. Click New Scope.
  4. In the Title text box, type File System. In the Display Groups section, check all check boxes. Click OK.
  5. In the Search Dropdown section, next to File System, click Add rules.
  6. In the Scope Rule Type section, click Web Address.
  7. In the Host Name textbox, specify your UNC path (for example: \\win2k8\Documents). Click OK.
    You may be notified that the scope will be updated in a few minutes. If so, either wait the required number of minutes and then continue at step 10, or perform steps 8 and 9 and then continue at step 10.
  8. In Central Administration, go to the Search Service Application | Search Administration.
  9. In the System Status section, next to Scopes needing update, click Start update now.
  10. Switch back to your Search Center. On the Site Actions menu, click Edit Page.
  11. Edit the Search Box Web Part. In the properties of the Web Part, expand the Scopes Dropdown section.
  12. In the Dropdown mode list, click Show scopes dropdown.
  13. Click OK.
  14. On the ribbon, click Save. Note that the scopes drop-down list appears, and that your new File System scope is included in the list.

 

Setting up Search Infrastructure – Part II


Some of the content appearing in these posts is taken from the SharePoint 2010 Search Evaluation Guide which can be downloaded from here.

 

This post covers the following –

  • Creating Authoritative Pages
  • Creating Federated Locations

 

Part I of this series is available here

Queries and Results Settings

The following step-by-step instructions will help you get started working with queries and results settings.

Creating Authoritative Pages

Many ingredients go into the SharePoint 2010 search engine's ranking algorithm. They include: contextual relevance, metadata extraction, automatic language detection, file type biasing, click distance, anchor text, URL depth and URL matching. When a user enters a search term into the SharePoint 2010 search box and clicks Search, they are presented with a results page. A lot goes into turning that innocent click into highly relevant results. The SharePoint 2010 search engine delivers highly relevant results because its ranking algorithm decides how the results are ordered: it determines whether a particular result (link) lands on page 1, position 1 or on page 17, position 3, taking all of the ingredients above into account.

Although it has served the search engine world well NOT to trust humans, SharePoint 2010 Search allows you to influence at least one of the ingredients that help determine a page's rank, in the form of Authoritative Pages. Authoritative Pages fall under the "click distance" ingredient. An Authoritative Page is a page which you have declared as, well, somehow better than the rest. You can have as many Authoritative Pages as you need, and at different levels of authority. You are essentially saying that a page should be considered a better match for any given search term that qualifies it as a result candidate. Keep in mind this is merely one ingredient carefully scrutinized by the ranking algorithm; declaring an Authoritative Page does not guarantee it will rank well for every search term used (and it shouldn't).

  1. In the Search Center, enter a search query (for example: SharePoint Deployment). Notice the position of a document in the search results.
  2. Copy the link to the document.
  3. In Central Administration, on the Quick Launch of the FAST Query Search Service Application, in the Queries and Results section, click Authoritative Pages.
  4. Add a new line in the Most authoritative pages box, paste the URL, and ensure that the Refresh Now checkbox is enabled. Click OK.
  5. In the System Status section of the FAST Query SSA, you will see the value Computing Ranking displayed in the Background activity label. Wait a minute or so for the new rankings to be computed. When the computation is done, the value in the label changes to None.
  6. In the Search Center, run the same query again and note the difference in the rank of the page in the results.

 

 

Creating Federated Locations

Federation is the concept of retrieving search results from multiple search providers, based on a single query performed by an information worker. For example, your organization might include federation with Bing.com so that results are returned by SharePoint Server and Bing.com for a given query.
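Federated locations in SharePoint 2010 are built on the OpenSearch 1.1 standard: the FLD file ultimately wraps an OpenSearch description whose URL template receives the user's query. A stripped-down sketch of such a description is shown below; the template URL is a placeholder only, and a real FLD file carries additional SharePoint-specific configuration around it:

```xml
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>YouTube</ShortName>
  <Description>Video results</Description>
  <!-- {searchTerms} is replaced with the user's query at run time -->
  <Url type="application/rss+xml"
       template="http://example.com/feeds/videos?q={searchTerms}"/>
</OpenSearchDescription>
```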

  1. On the Quick Launch of the Search Service Application, in the Queries and Results section, click Federated Locations.
  2. Click Import Location, browse to the YouTube.FLD (federated location definition) file, and click OK.
  3. Once the FLD file is successfully imported, click Edit Location.
  4. In the Edit Federated Location page, notice that the information from the FLD file has been extracted. Go to the Trigger section. Since we want to federate the search to YouTube only if the search query matches a pattern such as "video Harley Davidson", select the Prefix radio button and enter video in the textbox.
  5. Browse to your Search Center site. In the search box, type a search term and press [ENTER] to get the Search Results page.
  6. Edit the Search Results page. In the Right Zone, add the Federated Results Web Part from the Search category.
  7. Edit the Web Part and, in its properties pane, in the Location section drop-down list, click YouTube, and then click OK.
  8. On the ribbon, click Save and Close.
  9. Enter a search query with the video prefix and you should see results from YouTube appearing in your search results.

Setting up Search Infrastructure – Part I


Some of the content appearing in these posts is taken from the SharePoint 2010 Search Evaluation Guide which can be downloaded from here.

 

This post covers the following –

  • Creating Enterprise Search Centers
  • Creating Content Sources
  • Creating Crawl Rules

 

The enterprise search features provided by SharePoint Server 2010 can be administered at the site collection level and at the Search service application level. The following sections provide step-by-step instructions for working with various aspects of enterprise search in SharePoint Server 2010. Administrators can use the Search Administration pages to manage search settings that affect all Web applications that consume the search service. Administrators will typically start here when configuring the search system. The main day-to-day operations include creating content sources, configuring crawler settings, configuring settings to improve relevance for those content sources, adding federated content repositories, and working with search reports. The following step-lists provide instructions for performing common operations in all of these scenarios.

 

Creating Enterprise Search Centers

A Search Center is a site based on the Search Center site template. It provides a focused user interface that enables information workers to run queries and work with search results.

The following procedure creates a Search Center at the root Web for a site collection. This is the generally recommended approach and architecture for creating Search Center sites with SharePoint Server 2010.

  1. Click Start>All Programs>Microsoft SharePoint 2010 Products>SharePoint 2010 Central Administration.
  2. In the Application Management group, click on the Create Site Collections link.
  3. Create a new site collection in the web application of your choice. In the Title text box, type Search Center. In the Description text box, type Enterprise Search Center for SharePoint 2010.
  4. In the Web Site Address section, select /sites/ in the drop-down list, and then type search in the text box. In the Template Selection section, click the Enterprise tab. Click FAST Search Center.
  5. In the Primary Site Collection Administrator section, type your name in the text box, and then click Check Names. Click OK.
    After a short period of time, the site collection is created and the Top-Level Site Successfully Created page appears.
  6. Click the hyperlink to the new site collection to start exploring the Search Center.

 

Creating Content Sources

Content sources are definitions of systems that will be crawled and indexed. For example, administrators can create content sources to represent shared network folders, SharePoint sites, other Web sites, Exchange public folders, third-party applications, databases, and so on.

  1. Start SharePoint 2010 Central Administration.
  2. In the Application Management section, click Manage service applications | FAST Content.
  3. On the Quick Launch, in the Crawling section, click Content Sources.
  4. Click New Content Source.
  5. If you do not see a content source for SharePoint sites already created, create one before proceeding to the next step.
  6. Create a new content source named Documents that points to a file share on your machine containing a set of documents. Optionally create a crawl schedule while defining this content source.

    Note: You will need to specify the path using UNC naming conventions, and may need to share the folder before you can specify the path.

  7. After the content source has been created, start a full crawl on it.


 

Creating Crawl Rules

Crawl rules specify how crawlers retrieve content to be indexed from content repositories. For example, a crawl rule might specify that specific file types are to be excluded from a crawl, or might specify that a specific user account is to be used to crawl a given range of URLs.

Crawl schedules specify the frequency and dates/times for crawling content repositories. Administrators create crawl schedules so that they do not have to start all crawl processes manually.

A crawler impact rule governs the load that the crawler places on source systems when it crawls their content. For example, one crawler impact rule might specify that a specific content repository that is not used heavily by information workers should be crawled by requesting 64 documents simultaneously, whereas another might specify less aggressive crawl characteristics for systems that are constantly in use by information workers.

  1. On the Quick Launch of the FAST Content service application, in the Crawling section, click Crawl Rules.
  2. Click New Crawl Rule.
  3. Specify the path file://<<machinename>>/<<sharename>> of the content source you created earlier. Include all items in this path. Since the default content access account may not have adequate permissions to access the file share, use the Specify a different content access account option in the Specify Authentication section to provide credentials that have read access to the content source. Close the page.
  4. Start a crawl of the content source to make sure there are no errors in the crawl rule.
  5. Navigate to the Search Center website and enter a search query to make sure that content from the file system appears in the search results.

FAST Search for SharePoint 2010 – Creating a custom property extractor


  1. Open Visual Studio and create a custom property extraction dictionary. A property extraction dictionary is an XML file that defines which words will be searched for in the indexed items and indexed in the associated managed property. Each entry has a key and an optional value. The key is the string that must be present in the item. Depending on the type of property extractor (whole words or word parts), matching of the key can be case-sensitive or case-insensitive. Note: a key should not contain any apostrophes; if it does, the term will never be matched.
  2. Save the file as technologies.xml. Ensure there are no spaces or new lines after the closing dictionary tag, or the dictionary will generate an error. The custom dictionary must be saved in UTF-8 format without a byte order mark (BOM). In Visual Studio, click File, Advanced Save Options, and select the Unicode (UTF-8 without signature) encoding option.
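For illustration, a minimal extraction dictionary with the key/value structure described above might look like this (the entries shown here are examples only, not from the original post):

```xml
<dictionary>
  <!-- key: the (case-insensitive, for whole-word extractors) string matched in the item -->
  <!-- value: the optional string actually stored in the crawled property -->
  <entry key="sharepoint" value="SharePoint"/>
  <entry key="silverlight" value="Silverlight"/>
</dictionary>
```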

  3. Rename technologies.xml to wholewords_extraction1.xml.
  4. Navigate to the FASTSearch\components\resourcestore\dictionaries\matching folder and replace the original wholewords_extraction1.xml with the file you just created.
  5. Use either the FAST Query service application in Central Administration or PowerShell to create a new managed property called technology and link it to a crawled property called wholewords1. If you want to use PowerShell, you can use the following script:

$c = Get-FASTSearchMetadataCrawledProperty -Name wholewords1
$m = New-FASTSearchMetadataManagedProperty -Name technology -Type 1
$m.RefinementEnabled = 1
Set-FASTSearchMetadataManagedProperty -ManagedProperty $m
New-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $m -CrawledProperty $c

 

  6. Activate the wholewords1 property extractor by editing the FASTSearch\etc\config_data\DocumentProcessor\optionalprocessing.xml file. Scroll down in the file and activate the wholewordsextractor1 processor.
  7. Restart the document processors by executing the following command:

psctrl reset

  8. Go to the FAST Search Connector service application and start a full crawl on the relevant content sources. While the full crawl is in progress, you can proceed to the next stage.
  9. Go to the Search Center and open the search results page. Configure the search results page to enable refinement on your new managed property: switch the page into edit mode by clicking Site Actions, Edit Page.
  10. Locate the Refinement Panel Web Part and edit its properties.
  11. In the editor zone that appears on the right-hand side of the page, expand the Refinement category and click the ellipsis button next to Filter Category Definition.
  12. Copy the entire XML contents into an XML editor such as Visual Studio 2010 or SharePoint Designer.
  13. Add the following XML to the file:

     

    <Category Title="Technologies" Description="" Type="Microsoft.Office.Server.Search.WebControls.ManagedPropertyFilterGenerator" MetadataThreshold="3" NumberOfFiltersToDisplay="10" MaxNumberOfFilters="20" ShowMoreLink="True" MappedProperty="technology" MoreLinkText="show more" LessLinkText="show fewer" ShowCounts="Count" />

     

     

  14. Copy the modified XML contents and paste them back into the Filter Category Definition property. Click OK to close the editor zone. Save and close the page.

Note: Ensure that the Use Default Configuration checkbox is not enabled. If you leave it enabled, the modifications made to the Filter category definition will be lost when you click OK.

  15. If the crawl has completed, test the custom property extraction. Search for a keyword like deployment and you should see a new refiner called Technologies in the refinement panel.

 

  16. In this exercise, we made use of a built-in crawled property called wholewords1 as well as a built-in property extractor. In the next exercise, we will create a custom crawled property and a custom property extractor instead of using the built-in ones.

 

  17. Upload the custom property extraction dictionary to the FAST Search Server 2010 for SharePoint resource store by using the Windows PowerShell cmdlet Add-FASTSearchResource, or by simply copying it to the FASTSearch\components\resourcestore\dictionaries\matching folder.

    Add-FASTSearchResource -FilePath c:\temp\technologies.xml -Path dictionaries\matching\technologies.xml

  18. Next, we need to configure the custom property extraction item processing stage through an XML configuration file named CustomPropertyExtractors.xml. The file should be present in the FASTSearch\etc\config_data\DocumentProcessor folder. If this is the first time you are creating a custom property extractor, you will need to create a new file; otherwise add to the existing file.
  19. Create a new file, or open the already existing CustomPropertyExtractors.xml.
  20. Add the following XML to the file. Note that the property value should be the name of the crawled property (which will be created subsequently), and the dictionary name should be the name of the XML file you copied to the resource store folder in the earlier step.

<?xml version="1.0" encoding="utf-8"?>
<extractors>
    <extractor name="Technology terms" type="Verbatim" property="mytechnologyterms">
        <dictionary name="technologies" yield-values="yes"/>
    </extractor>
</extractors>

 

  21. Delete the previously created managed property, technology, from either the Central Administration UI or PowerShell. To delete the property via PowerShell, use the following commands:

$m = Get-FASTSearchMetadataManagedProperty -Name technology
Remove-FASTSearchMetadataManagedProperty -ManagedProperty $m

 

  22. On the administration server, type the following command:

    psctrl reset

    This resets all currently running item processors in the system and activates the new item processing configuration.

  23. To use the extracted data in queries or query refinement, you must create the crawled property and map it to a managed property within the index schema. All extracted crawled properties must be in the crawled property category named MESG Linguistics, with property set value 48385c54-cdfc-4e84-8117-c95b3cf8911c and variant type 31.

$cp = New-FASTSearchMetadataCrawledProperty -Name mytechnologyterms -Propset 48385c54-cdfc-4e84-8117-c95b3cf8911c -VariantType 31
$mp = New-FASTSearchMetadataManagedProperty -Name technology -Type 1
$mp.StemmingEnabled = 0
$mp.RefinementEnabled = 1
$mp.MergeCrawledProperties = 1
$mp.Update()
New-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $mp -CrawledProperty $cp

 

  24. Start a full crawl of all the relevant content sources from the FAST Search Connector service application.
  25. If you have already completed the previous exercise, then as long as the managed property created in both exercises has the same name (technology), you don't need to reconfigure the Refinement Panel Web Part. Wait for the crawls to complete.
  26. Test the search results and you should see a refiner called Technologies in the refinement panel.

     

SharePoint – List of ….


 

Querystring parameters you should not use in your pages

ContentTypeId
Type
Mode
View
ID
VersionNo
FeatureId
ListTemplate
List
RootFolder

For a complete list visit – http://blogs.technet.com/b/stefan_gossner/archive/2009/01/30/querystring-parameters-you-should-not-use-in-your-sharepoint-application.aspx

Tokens that can be used with Custom Actions

{ItemId} Returns the ID (int) of the selected List Item
{ItemUrl} Returns URL of the selected List Item
{ListId} Returns the ID (GUID) of the list associated with the Custom Action
{SiteUrl} Returns the URL of the site from where the custom action was invoked
{RecurrenceId} Returns the ID (int) of an instance of a recurrent item. For more info

Custom Action IDs

http://johnholliday.net/resources/customactions.html

Feature IDs

http://blogs.msdn.com/b/mcsnoiwb/archive/2010/01/07/features-and-their-guid-s-in-sp2010.aspx

List Template IDs

http://mosshowto.blogspot.in/2009/04/sharepoint-list-template-ids.html

Site Template IDs

http://jimecox.wordpress.com/2011/07/24/site-templates-id-in-sharepoint-2010/

Ribbon Locations

http://msdn.microsoft.com/en-us/library/ee537543.aspx

Working with CommandAction and UrlAction


CommandAction and UrlAction elements are used to do something whenever a user clicks a Custom Action. They could do something as simple as redirecting the user to a URL or executing some JavaScript code. In a subsequent post, I will demonstrate how you can execute C# server-side code whenever a Custom Action is clicked. The CommandAction element is used when the location of a custom action is the Ribbon, whereas the UrlAction element is used when the action is located anywhere other than the Ribbon.

Assume you have created a custom action that looks like this. Since the location of the Custom Action is the ECB, we use the UrlAction element.

    <CustomAction Id="TestAction1"
                  Location="EditControlBlock"
                  RegistrationType="List"
                  RegistrationId="101"
                  Sequence="10000"
                  Title="Test Action1">
        <UrlAction Url="/_layouts/somepage.aspx"/>
    </CustomAction>

And another one which looks like this. This time we use the CommandAction element since the location of the Custom Action is the Ribbon

    <CustomAction Id="TestAction2"
                  Location="CommandUI.Ribbon"
                  RegistrationType="List"
                  RegistrationId="101">
        <CommandUIExtension>
            <CommandUIDefinitions>
                <CommandUIDefinition Location="Ribbon.Documents.Workflow.Controls._children">
                    <Button Id="StartWorkflowButton1"
                            Image16by16="/_layouts/Images/Workflows/Check_16x16.png"
                            Image32by32="/_layouts/Images/Workflows/Check_32x32.png"
                            LabelText="Test Action"
                            Command="TestCommand"
                            Sequence="10000"
                            TemplateAlias="o1"/>
                </CommandUIDefinition>
            </CommandUIDefinitions>
            <CommandUIHandlers>
                <CommandUIHandler Command="TestCommand"
                                  CommandAction="/_layouts/somepage.aspx"
                                  >
                </CommandUIHandler>
            </CommandUIHandlers>
        </CommandUIExtension>
    </CustomAction>

Both, by default, assume that the value assigned to them is a URL. So if you did something like this, then on click of the action, SharePoint will attempt to redirect the user to a page called alert(…..

<CommandUIHandler Command="StartWorkflowCommand"
                  CommandAction="alert('Hello World')">
</CommandUIHandler>

If you want to execute some JavaScript code in response to the Action, then prefix it with the javascript keyword as follows –

<CommandUIHandler Command="StartWorkflowCommand"
                  CommandAction="javascript:alert('Hello World')">
</CommandUIHandler>

You can even have multiple lines of JavaScript code like this –

                <CommandUIHandler Command="TestCommand"
                                  CommandAction="javascript:alert('Hello'); 
                                                            alert('Hello again');">
                </CommandUIHandler>

OR

        <UrlAction Url="javascript:alert('Hello'); 
                                   alert('Hello again');"/>

However, if you have multiple lines of JavaScript to execute, it would be much better to encapsulate them in a function like this –

                <CommandUIHandler Command="TestCommand"
                                  CommandAction="javascript:SayHello();
                                                function SayHello() {
                                                    alert('Hello');
                                                    alert('Hello Again');
                                                }">
                </CommandUIHandler>

Sometimes you may want to use the ID of the list or selected list item in your code. In such cases, SharePoint provides a list of tokens you can use. For example the {ItemId} token provides the ID of the selected list item, {ListId} provides the ID of the list associated with the Action and {SiteUrl} provides the URL of the site containing the action. There are other tokens as well, but these are the most commonly used ones.

Out of all the tokens available, {ItemId} can be used only if the location of the Custom Action is the EditControlBlock. This is because the {ItemId} token returns just one ID and the only way in which one and only one ID can be returned is if you are working with the ECB. Actions in other locations like the Ribbon can be applied to multiple list items at a time and hence cannot use the {ItemId} token.

Either way, the tokens cannot be used INSIDE the JavaScript code. For example, if you tried to do something like this, it wouldn't work.

        <UrlAction Url="javascript:SayHello();
                        function SayHello() {
                            alert('You selected item id :' + {ItemId});
                            alert('from list id :' + {ListId});
                        }" />

Instead the tokens should be evaluated and their values passed to functions like this –

        <UrlAction Url="javascript:SayHello({ItemId}, '{ListId}');
                        function SayHello(itemid, listid) {
                            alert('You selected item id :' + itemid);
                            alert('from list id :' + listid);
                        }" />

In the above XML, notice that {ListId} is enclosed in quotes. This is important as List IDs are Guids and need to be passed as a string value to the function. The same is not necessary for {ItemId} as ItemIds are integers.
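The difference is easy to see if you simulate the substitution yourself. The snippet below is only a sketch — the substitute helper and the token values are made up, not SharePoint code — but it shows why the unquoted integer stays valid JavaScript while the GUID needs the surrounding quotes.

```javascript
// Hypothetical stand-in for SharePoint's server-side token substitution.
function substitute(template, tokens) {
    return template.replace(/\{(\w+)\}/g, function (match, name) {
        return tokens[name];
    });
}

// With {ListId} quoted, the substituted text is still valid JavaScript:
var good = substitute("SayHello({ItemId}, '{ListId}');",
                      { ItemId: 4, ListId: 'f8b2c51e' });
// good === "SayHello(4, 'f8b2c51e');"

// Without the quotes, the GUID would land in the script bare, and
// "SayHello(4, f8b2c51e);" would throw at runtime.
```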

If you tried to do the same thing in the CommandAction attribute like this, it wouldn't work.

<CommandUIHandler Command="TestCommand"
                    CommandAction="javascript:SayHello({ItemId}, '{ListId}');
                                function SayHello(itemid, listid) {
                                    alert('You selected item id :' + itemid);
                                    alert('from list id :' + listid);
                                }">
</CommandUIHandler>

Since the {ItemId} token is not valid in the CommandAction attribute, it doesn't get evaluated, and this results in a JavaScript exception. If you look at the screenshot, the {ListId} token has been successfully evaluated, whereas the {ItemId} token has been ignored.

[Screenshot: the resulting script, with {ListId} evaluated to a GUID while {ItemId} remains unevaluated]

To get the selected list items, you will need to write the following JS code in the function –

CommandAction="javascript:SayHello('{ListId}');
            function SayHello(listid) {
                var selecteditems = SP.ListOperation.Selection.getSelectedItems();
                                                    
                alert('You selected item id :' + selecteditems[0].id);
                alert('from list id :' + listid);
            }">

The getSelectedItems method, as expected, returns an array of the selected list items.
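As a rough illustration, here is what working with that array looks like. The array below is hand-built test data standing in for a real SharePoint result (its exact shape is an assumption); the only property the code above relies on is id.

```javascript
// Hand-built stand-in for the array returned by
// SP.ListOperation.Selection.getSelectedItems().
var selecteditems = [
    { id: '4' },
    { id: '9' }
];

// Same access pattern as in the handler above:
var firstId = selecteditems[0].id;   // id of the first selected item
var count = selecteditems.length;    // number of selected items
```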

As your code gets more and more complex, it becomes very cumbersome to keep writing it as a string value assigned to the CommandAction or UrlAction attribute. Not only is it cumbersome because of the lack of intellisense, color coding etc., it is also virtually impossible to debug this code if it doesn't work as expected, and code reuse is out of the question. For example, maintaining this kind of code would be a nightmare.

<CommandUIHandler Command="StartWorkflowCommand"
                    CommandAction="javascript:RedirectToWorkflowLaunchPage('{SiteUrl}', '{ListId}')
                                  
                    function RedirectToWorkflowLaunchPage(siteurl, listid) {
                        //Some feedback to the user, while we re-direct to the application page.
                        SP.UI.Notify.addNotification('Redirecting...');

                        //Get a list of all the selected items (the items on which the workflow is supposed to run)
                        var items = SP.ListOperation.Selection.getSelectedItems();

                        //Iterate thru the array of selected items and create a comma separated string consisting of all the selected item ids.
                        var selecteditems = '';
                        for (i in items) {        
                            selecteditems += items[i].id + ',';
                        }

                        //strip out the last ','
                        selecteditems = selecteditems.substr(0,selecteditems.length-1);

                        //redirect the user to a dummy aspx page.
                        //We pass the id of the list and the selected items in the querystring to the page
                        window.location = siteurl + '/_layouts/Workflows/workflowlauncher.aspx?listid=' + listid + ' &amp; itemids=' + selecteditems;
                    }"
                                  
                    EnabledScript="javascript:CheckSelectedItems()">                    
</CommandUIHandler>

To overcome all these issues, we can take our JavaScript code and move it to an external JS file.

The CommandAction or UrlAction attribute now becomes much cleaner.

<CommandUIHandler Command="StartWorkflowCommand"
                    CommandAction="javascript:RedirectToWorkflowLaunchPage('{SiteUrl}', '{ListId}')"                                  
                    EnabledScript="javascript:CheckSelectedItems()">                    
</CommandUIHandler>

In order to ensure that the JS file in which you have embedded your code is available on the pages where your custom action is going to be rendered, you need to create another action and specify its location as ScriptLink. The ScriptSrc attribute points to the JS file.

    <CustomAction Id="ScriptSource1"
                  ScriptSrc="Workflows/RibbonActionCommands.js"
                  Location="ScriptLink"
                  Sequence="100">
    </CustomAction>

Once this is done, the JavaScript code lives in its own file, and we can use the Developer Tools provided by the browser to debug it easily.

To debug it, follow these steps –

  1. Open the page containing the action in your browser. I am using IE 9.
  2. Press F12 to open the Developer Tools window.
  3. If you are working with IE, go to the Scripts tab. Use the scripts dropdown and select the .JS file containing your JavaScript code.

[Screenshot: the Developer Tools Scripts tab with the JS file selected]

  4. Put a breakpoint on the line from where you want to start debugging and click the Start debugging button.
  5. Click on your custom action; you should hit your breakpoint and can start debugging your code.

How to launch a workflow on more than one list item simultaneously


A very common question that pops up during my training sessions is how one can launch a workflow on multiple list items at the same time.

To do this, first and foremost create a Custom Action and display it on the Ribbon. The following xml added to an Empty Element in a SharePoint Project will give us the Custom Action.

<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
    <CustomAction Id="ScriptSource1"
                  ScriptSrc="Workflows/RibbonActionCommands.js"
                  Location="ScriptLink"
                  Sequence="100">
    </CustomAction>

    <CustomAction Id="StartWorkflowAction"
                  Location="CommandUI.Ribbon"
                  RegistrationType="List"                  
                  RegistrationId="101">
        <CommandUIExtension>
            <CommandUIDefinitions>
                <CommandUIDefinition Location="Ribbon.Documents.Workflow.Controls._children">
                    <Button Id="StartWorkflowButton1"
                            Image16by16="/_layouts/Images/Workflows/Check_16x16.png"
                            Image32by32="/_layouts/Images/Workflows/Check_32x32.png"
                            LabelText="Start Approval Workflow"
                            Command="StartWorkflowCommand"
                            Sequence="10000"
                            TemplateAlias="o1"/>
                </CommandUIDefinition>
            </CommandUIDefinitions>
            <CommandUIHandlers>
                <CommandUIHandler Command="StartWorkflowCommand"
                                  CommandAction="javascript:RedirectToWorkflowLaunchPage('{SiteUrl}', '{ListId}')"
                                  EnabledScript="javascript:CheckSelectedItems()">                    
                </CommandUIHandler>
            </CommandUIHandlers>
        </CommandUIExtension>
    </CustomAction>                  
</Elements>

Add the Layouts mapped folder to the solution and, in it, add a JS file (named RibbonActionCommands.js in this example) which will hold the JavaScript code for the Ribbon Button. Add another application page called "WorkflowLauncher.aspx" which will contain the C# code to programmatically launch a workflow on the selected list items.

[Screenshot: Solution Explorer showing the Layouts mapped folder containing RibbonActionCommands.js and WorkflowLauncher.aspx]

In the RibbonActionCommands.js file, add the following JavaScript code –

function RedirectToWorkflowLaunchPage(siteurl, listid) {
    //Some feedback to the user, while we re-direct to the application page.
    SP.UI.Notify.addNotification('Redirecting...');

    //Get a list of all the selected items (the items on which the workflow is supposed to run)
    var items = SP.ListOperation.Selection.getSelectedItems();

    //Iterate thru the array of selected items and create a comma separated string consisting of all the selected item ids.
    var selecteditems = "";
    for (i in items) {        
        selecteditems += items[i].id + ",";
    }

    //strip out the last ','
    selecteditems = selecteditems.substr(0,selecteditems.length-1);

    //redirect the user to a dummy aspx page.
    //We pass the id of the list and the selected items in the querystring to the page
    window.location = siteurl + "/_layouts/Workflows/workflowlauncher.aspx?listid=" + listid + "&itemids=" + selecteditems;
}
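The id-joining loop in RedirectToWorkflowLaunchPage can also be written with map and join, which avoids having to strip the trailing comma. This is just an alternative sketch; the items array here is fake test data standing in for what getSelectedItems() returns.

```javascript
// Fake selected-items array, mimicking the objects used above.
var items = [{ id: '3' }, { id: '7' }, { id: '12' }];

// Build the comma separated id string in one step; no trailing ',' to strip.
var selecteditems = items.map(function (item) {
    return item.id;
}).join(',');
// selecteditems === "3,7,12"
```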

function CheckSelectedItems() {
    //This function returns true only when at least one list item is selected.
    //As long as it returns false, the Ribbon Button will remain disabled.
    var items = SP.ListOperation.Selection.getSelectedItems();
    return items.length >= 1;
}

In the markup of the WorkflowLauncher.aspx page, add a Label control to the main content placeholder; it will be used to display status messages –
<asp:Content ID="Main" ContentPlaceHolderID="PlaceHolderMain" runat="server">
    <asp:Label runat="server" ID="lblStatus"></asp:Label>
</asp:Content>

In the code behind file, add the following code in the Page Load event handler

        protected void Page_Load(object sender, EventArgs e)
        {
            string listid = Request.QueryString["listid"].ToString();
            string itemids = Request.QueryString["itemids"].ToString();
            string[] ids = itemids.Split(',');

            //starting a workflow on list items will involve updating some information in the Content DBs.
            //Since these updates will be happening in response to an HTTP GET request, they will be disallowed by default
            SPContext.Current.Web.AllowUnsafeUpdates = true;

            //get a reference to the list            
            SPList list = SPContext.Current.Web.Lists[new Guid(listid)];

            //get a reference to the workflow we want to start. 
            //In this example, we assume that the workflow we need to start is named 'Approval'
            SPWorkflowAssociationCollection workflows = list.WorkflowAssociations;
            SPWorkflowAssociation association = null;
            foreach (SPWorkflowAssociation item in workflows)
            {
                if (item.Name == "Approval")
                    association = item;                
            }

            //We couldn't find a workflow named 'Approval' associated with the list/library
            if (association == null)
                lblStatus.Text = "<h3>There is no workflow named 'Approval' associated with this Library</h3>";
            else
            {
                //Start long operation
                SPLongOperation op = new SPLongOperation(this.Page);
                op.LeadingHTML = "Please wait while the Approval workflow starts on the selected items";
                op.Begin();

                using (var manager = SPContext.Current.Site.WorkflowManager)
                {                    
                    foreach (string id in ids)
                    {
                        //item on which the workflow should run
                        SPListItem item = list.Items.GetItemById(Int32.Parse(id));

                        //start the workflow, passing along the association data
                        manager.StartWorkflow(item, association, association.AssociationData);
                    }
                }

                //End Long operation
                op.End(list.DefaultViewUrl);
            }
            SPContext.Current.Web.AllowUnsafeUpdates = false;                
        }

Since initiating a workflow on multiple list items could potentially take a long time to complete, we use the SPLongOperation class to wrap that operation. As a result, the user sees the familiar page that SharePoint pulls up whenever a long-running operation is in progress.

[Screenshot: the standard SharePoint long-running operation page]

The end result is that the Approval workflow automatically starts running on all the selected items.

[Screenshot: the Approval workflow shown as In Progress on the selected items]

Consuming WCF REST services using JQuery and C#


The objective of this post is to demonstrate how WCF RESTful services can be used to perform CRUD operations against a data source, and how to consume them from both a JavaScript client and a .NET client written in C#.

Assuming you have a WCF service configured to use the webHttpBinding, you could have operations defined as shown below that allows you to Create, Read, Update and Delete a particular record (in this example a record from the Category table in the Northwind database).

//Returns a list of Categories
[WebGet(ResponseFormat = WebMessageFormat.Json)]
public List<Category> GetCategories()
{
    NorthwindDataContext northwind = new NorthwindDataContext();
    var q = northwind.Categories;
    return q.ToList();
}

The WebGet attribute allows an HTTP GET request to be made to invoke this operation. Because we haven't specified any UriTemplate for this operation, the url to which the GET request has to be made will be something like – http://server:port/Service1.svc/GetCategories. Also, thanks to the ResponseFormat parameter, the list of categories will be sent in JSON format.

The next operation returns just one single Category object.


//Returns an individual category object
[WebGet(UriTemplate = "Get/Category/{id}", ResponseFormat = WebMessageFormat.Json)]
public Category GetCategory(string id)
{
    NorthwindDataContext northwind = new NorthwindDataContext();
    var category = northwind.Categories.FirstOrDefault(c => c.CategoryID == Convert.ToInt32(id));
    return category;
}

This time, since we have specified a UriTemplate, the url to retrieve the category with categoryid=5 will be something like – http://server:port/Service1.svc/Get/Category/5. The placeholder {id} in the url matches the string input parameter of this method. Note that even though I would have preferred the data type of the input parameter to be an int, I have to specify it as a string.
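To make the placeholder matching concrete, here is a small sketch of the idea behind a UriTemplate: turn each {name} into a capture group and pull the matched segment out as a string. This is illustrative only — WCF's actual UriTemplate class is far more capable — but it shows why the value always arrives as a string.

```javascript
// Illustrative-only sketch of UriTemplate-style matching (not WCF code).
function matchUriTemplate(template, path) {
    var names = [];
    // turn each {name} placeholder into a capture group
    var pattern = template.replace(/\{(\w+)\}/g, function (match, name) {
        names.push(name);
        return '([^/]+)';
    });
    var m = new RegExp('^' + pattern + '$').exec(path);
    if (!m) return null;
    var result = {};
    names.forEach(function (name, i) { result[name] = m[i + 1]; });
    return result;
}

var parsed = matchUriTemplate('Get/Category/{id}', 'Get/Category/5');
// parsed.id === '5' — the matched segment is a string,
// which is why the method parameter is declared as string.
```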

The next operation allows us to make an HTTP GET request and pass the categoryid of the category we want to delete.

//Deletes a category
[WebGet(UriTemplate = "/Delete/Category/{id}", ResponseFormat = WebMessageFormat.Json)]
public bool DeleteCategory(string id)
{
    try
    {
        NorthwindDataContext northwind = new NorthwindDataContext();
        Category category = northwind.Categories.FirstOrDefault(c => c.CategoryID == Convert.ToInt32(id));
        northwind.Categories.DeleteOnSubmit(category);
        northwind.SubmitChanges();
        return true;
    }
    catch (Exception)
    {
        return false;
    }
}

The next operation is slightly different from the others defined so far. It doesn't take a primitive value (like a string) as an input parameter; instead it takes a complex type, a Category object. Since a complex object cannot be passed thru the querystring, this time we use the WebInvoke attribute instead of WebGet. WebInvoke, by default, allows an HTTP POST request to be made to invoke the operation. And since we have specified the RequestFormat to be JSON, the category object will have to be sent in that format (the default being XML).


//Adds a new category.
[WebInvoke(UriTemplate = "/Add/Category", RequestFormat = WebMessageFormat.Json, ResponseFormat = WebMessageFormat.Json)]
public bool AddCategory(Category category)
{
    try
    {
        NorthwindDataContext northwind = new NorthwindDataContext();
        northwind.Categories.InsertOnSubmit(category);
        northwind.SubmitChanges();
        return true;
    }
    catch (Exception)
    {
        return false;
    }
}

Lastly, the EditCategory operation allows an existing category to be modified; it is very similar to the previous Add operation.

[WebInvoke(UriTemplate = "/Edit/Category", RequestFormat = WebMessageFormat.Json, ResponseFormat = WebMessageFormat.Json)]
public bool EditCategory(Category category)
{
    try
    {
        NorthwindDataContext northwind = new NorthwindDataContext();
        var categoryFromDB = northwind.Categories.FirstOrDefault(c => c.CategoryID == category.CategoryID);

        //copy the incoming values onto the tracked entity so LINQ to SQL detects the change
        //(reassigning the local variable itself would not persist anything)
        categoryFromDB.CategoryName = category.CategoryName;
        categoryFromDB.Description = category.Description;

        northwind.SubmitChanges();
        return true;
    }
    catch (Exception)
    {
        return false;
    }
}

Before moving on to building the client application to consume this service, it would be a good idea to test some of these operations in the browser and verify that they are working as expected.

The Client

To consume this RESTful service, I started off with a new HTML page and added the following markup


<form>
    <h2>Categories</h2>

    <ul id="Categories"></ul>
    <p></p>

    <label for="txtCategoryID">Category ID &nbsp;&nbsp;&nbsp;&nbsp;&nbsp; : </label>
    <input type="text" id="txtCategoryID" style="width:75px" />
    <br />

    <label for="txtCategoryName">Category Name :</label>
    <input type="text" id="txtCategoryName" />
    <br />

    <label for="txtDescription">Description &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; :</label>
    <input type="text" id="txtDescription" style="width:400px" />
    <p></p>

    <input type="button" id="btnEdit" value="Save Changes" />
    <input type="button" id="btnAdd" value="Add new Category" />
    <input type="button" id="btnDeleteCategory" value="Delete Category" />
    <p></p>

    <span id="spnStatus"></span>
</form>

To make the JavaScript programming easier, I added the JQuery and the JSON2.js files to the page.


<script src="Scripts/jquery-1.7.2.js" type="text/javascript"></script>
<script src="Scripts/json2.js" type="text/javascript"></script>

In another <script> tag, I added a global variable called url which points to the relative url of my web service and an event handler for the document ready event.


<script type="text/javascript">

    var url = 'Service1.svc/';

    $(document).ready(function () {
        GetAllCategories();
    });

The implementation of the GetAllCategories method is as follows –


function GetAllCategories() {
    //clear the previous values in some html elements
    $('#spnStatus').html('');
    $('#Categories li').remove();
    $('#txtCategoryID').val('');
    $('#txtCategoryName').val('');
    $('#txtDescription').val('');

    //Call the GetCategories method in the service
    $.getJSON(url + "GetCategories", null, function (data) {
        for (var i = 0; i < data.length; i++) {
            $('#Categories').append("<li><a href=javascript:GetCategory(" + data[i].CategoryID + ")>" + data[i].CategoryName + "</a></li>");
        }
    });
}

The jQuery getJSON method makes an HTTP GET request to the Service1.svc/GetCategories url. The second parameter is any data that you want to pass along with the request (in this case, null). The third parameter is the callback function that gets invoked when the request completes. Since we are using the getJSON method, the output received from the web service is automatically parsed as a JSON object without any further intervention from us. All I do here is iterate thru the categories returned and add each one as a li (list item) to the ul (unordered list).

The html page initially renders the categories as a list of hyperlinks. The anchor tag of each category has been set to call a function called GetCategory. The implementation of the function is as follows.


function GetCategory(categoryid) {
    //Call the GetCategory method in the service
    $.getJSON(url + "Get/Category/" + categoryid, null, function (data) {
        if (data == null) {
            $('#spnStatus').html('category not found');
        }
        else {
            //populate the html elements with various property values
            $('#txtCategoryID').val(data.CategoryID);
            $('#txtCategoryName').val(data.CategoryName);
            $('#txtDescription').val(data.Description);
        }
    });
}

In this function we use the same getJSON method to make an HTTP GET request to the Service1.svc/Get/Category/{id} url. The category object returned by the service is automatically parsed as a JSON object, and we can simply use properties like CategoryID and CategoryName to bind it to the various textboxes.

With a particular category now displayed, we can now click on the Delete Category button to delete it. The Delete Category button’s click event is wired up as follows (in the document ready event handler) –

$('#btnDeleteCategory').click(function () {
    $.getJSON(url + "Delete/Category/" + $('#txtCategoryID').val(), null, function (data) {
        if (data) {
            $('#spnStatus').html('category successfully deleted');
            GetAllCategories();
        }
        else
            $('#spnStatus').html('category could not be deleted');
    });
});

This function, too, makes an HTTP GET request, passing the category id that needs to be deleted as part of the URL.

The Add new Category and Save Changes buttons technically do the same thing; hence only one of them is discussed here. The implementation of the Add Category button is shown below –

$('#btnAdd').click(function () {
    //construct a category object
    var category = {
        CategoryName: $('#txtCategoryName').val(),
        Description: $('#txtDescription').val()
    };

    //use the JSON API to convert the object to a string
    var s = JSON.stringify(category);

    var request = $.ajax(url + "Add/Category", {
        type: "POST", // POST request
        data: s, // stringified version of the Category object
        dataType: "json", // output returned by the service should be parsed as JSON
        contentType: "application/json; charset=utf-8"
    }).done(function (data) {
        // success callback
        GetAllCategories();
        $('#spnStatus').html('category successfully added');
    }).fail(function (data) {
        // failure callback
        $('#spnStatus').html('category could not be added');
    });
});

The function first constructs a new Category object and populates its properties. CategoryID is left out since it is an identity column in the database. Since we are supposed to pass input to the service as JSON, we have to take the Category object and convert it to a JSON formatted string; the JSON.stringify method does that. Then we use jQuery's ajax method to invoke the service. We can't use the getJSON method used previously because we now need to make a POST request instead of a GET request. We could have used the post method; however, the post method does not allow us to change the contentType. The default contentType is 'application/x-www-form-urlencoded; charset=UTF-8', which will not work with our web service. The done and fail methods finally allow us to specify the success and failure callback functions.
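The essence of the ajax call above is just the combination of method, content type and stringified body. The helper below is a hypothetical sketch (not part of the original page) that isolates that request-building step so it can be reasoned about on its own.

```javascript
// Hypothetical helper: build the pieces of the POST request that the
// $.ajax call above sends to the Add/Category url.
function buildAddCategoryRequest(category) {
    return {
        type: 'POST',
        contentType: 'application/json; charset=utf-8',
        data: JSON.stringify(category)
    };
}

var req = buildAddCategoryRequest({
    CategoryName: 'Beverages',
    Description: 'Soft drinks, coffees, teas'
});
// req.data === '{"CategoryName":"Beverages","Description":"Soft drinks, coffees, teas"}'
```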

To add a new Category, fill up the relevant details and click on the “Add new Category” button.

Intercepting the request in a tool like Fiddler allows me to view the raw HTTP request and response. Once I can see the raw input and output, it becomes very easy to perform the same operation using C# code instead of JavaScript.

The following C# code makes an HTTP POST request to add a new category object. The Category object needs to be constructed carefully. It needs to be like this –

{"CategoryName":"Bakery Products","Description":"Breads, crackers, pasta, and cereal"}

And not like this

{'CategoryName':'Bakery Products','Description':'Breads, crackers, pasta, and cereal'}
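You can verify the quoting rule directly with the JSON API: the JSON grammar only permits double quotes around names and string values, so a parser rejects the single-quoted variant. A quick sketch:

```javascript
// Double-quoted JSON parses fine:
var valid = '{"CategoryName":"Bakery Products","Description":"Breads, crackers, pasta, and cereal"}';
var category = JSON.parse(valid);
// category.CategoryName === 'Bakery Products'

// Single-quoted "JSON" is rejected by the parser:
var invalid = "{'CategoryName':'Bakery Products'}";
var failed = false;
try {
    JSON.parse(invalid);
} catch (e) {
    failed = true; // SyntaxError
}
// failed === true
```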


//the category object in JSON format
string postData = "{\"CategoryName\" : \"Test from .NET\", \"Description\" : \"This category was inserted from a C# application\"}";

//convert the data to a byte array
byte[] byteArray = Encoding.UTF8.GetBytes(postData);

//create an HTTP request to the URL that we need to invoke
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost:49779/Service1.svc/Add/Category");

request.ContentLength = byteArray.Length;
request.ContentType = "application/json; charset=utf-8"; //set the content type to JSON
request.Method = "POST"; //make an HTTP POST

using (Stream dataStream = request.GetRequestStream())
{
    //write the request body
    dataStream.Write(byteArray, 0, byteArray.Length);
}

// Get the response.
WebResponse response = request.GetResponse();

All these examples used a web service which was expecting data in JSON format. If you plan to invoke these operations using JavaScript, then JSON will definitely be the format of choice. But if you are using C# or any language other than JavaScript to interact with the service, you may prefer to work with XML instead of JSON. If that's the case, change the WebInvoke attribute on the AddCategory method so that the RequestFormat is no longer set to WebMessageFormat.Json (XML being the default request format).

The C# code would change in just two places. Instead of creating a JSON representation of our object, we will need to create an XML representation. Secondly the contentType property of the HTTPRequest object would change to application/xml. The final code would be –


//category object in XML format
string postData =
    "<Category xmlns='http://schemas.datacontract.org/2004/07/WebApplication1' xmlns:i='http://www.w3.org/2001/XMLSchema-instance'>" +
        "<CategoryName>Bakery Products</CategoryName>" +
        "<Description>Breads, crackers, pasta, and cereal</Description>" +
    "</Category>";

byte[] byteArray = Encoding.UTF8.GetBytes(postData);

//create an HTTP request to the URL that we need to invoke
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost:49779/Service1.svc/Add/Category");

request.ContentLength = byteArray.Length;
request.ContentType = "application/xml"; //set the content type to XML
request.Method = "POST"; //make an HTTP POST

using (Stream dataStream = request.GetRequestStream())
{
    //write the request body
    dataStream.Write(byteArray, 0, byteArray.Length);
}

// Get the response.
WebResponse response = request.GetResponse();

So that’s it. That’s how a RESTful service can be invoked using both JavaScript as well as C#.

Restricting Search Results to contain only List Items / Documents


A very common request that comes up every now and then is to omit views, sites, lists/libraries etc. from Search results and only show list items/documents.

For example, in my test environment, if I search for the term "SharePoint", I get 23 search results. Some of the items in the results are site names or view names.

While not questioning the value or validity of these items in the search results, sometimes it may be necessary to make sure only documents or list items are included. So if I re-submit my search for SharePoint, but this time suffixed with a property called isDocument, I get only 14 items in the results, all of which are documents. List items in the Tasks and Announcements lists that contain the word SharePoint in their Title have been omitted as well.

If I alter my query to include a property called contentclass with a value starting with STS_ListItem, I get back 16 items. Two additional list items (one from the Announcements list and the other from the Tasks list) are now present in the search results.

So achieving this was pretty simple, but how do you know which properties to include in your search and what values to assign to them? This is where you need to inspect the raw xml returned by the Query Service. The raw xml gives us a wealth of information related to the search results. From the XML, I can see the list of managed properties returned, and that is how I can take advantage of them to limit the information returned from search.

In case you are interested in knowing how to get search results information in raw xml format, refer to my blog post – Viewing Search Results in raw XML Format.

If you are not comfortable telling your end-users to always include a property in their search queries, you could add the property name:value as an additional query term.

  1. To accomplish this, edit the Search Box webpart.
  2. Under the Query Text Box section, specify the property name:value pair in the Additional query terms property. Uncheck the Append additional terms to query checkbox to hide this additional query term from the end-user by not having it echoed in the Search Box when the page is refreshed.
  3. Save and Close the page. Now when I search for SharePoint, even without specifying any other search limiter, only documents matching the search term are displayed in the results.
