Thursday, October 16, 2014

Create and Configure Search Service Application in SharePoint 2013 using PowerShell



The core search architecture of SharePoint 2013 has a more complex and flexible topology that can be changed more efficiently by using Windows PowerShell. Each Search service application has its own search topology. If you create more than one Search service application in a farm, it’s recommended to allocate dedicated servers for the search topology of each Search service application.

In this blog, we will see how to configure the topology for one Search service application with multiple search components across two servers for redundancy and performance.

#==============================================================
#Search Service Application Configuration Settings
#==============================================================

$SearchApplicationPoolName = "SearchApplicationPool"
$SearchApplicationPoolAccountName = "Contoso\Administrator"
$SearchServiceApplicationName = "Search Service Application"
$SearchServiceApplicationProxyName = "Search Service Application Proxy"
$DatabaseServer = "2013-SP"
$DatabaseName = "SP2013 Search"
$IndexLocationServer1 = "D:\SearchIndexServer1"
mkdir -Path $IndexLocationServer1 -Force
$IndexLocationServer2 = "D:\SearchIndexServer2"
mkdir -Path $IndexLocationServer2 -Force

#==============================================================
#Search Application Pool
#==============================================================

Write-Host -ForegroundColor DarkGray "Checking if Search Application Pool exists"
$SPServiceApplicationPool = Get-SPServiceApplicationPool -Identity $SearchApplicationPoolName -ErrorAction SilentlyContinue
if (!$SPServiceApplicationPool)
{
    Write-Host -ForegroundColor Yellow "Creating Search Application Pool"
    $SPServiceApplicationPool = New-SPServiceApplicationPool -Name $SearchApplicationPoolName -Account $SearchApplicationPoolAccountName -Verbose
}

#==============================================================
#Search Service Application
#==============================================================

Write-Host -ForegroundColor DarkGray "Checking if SSA exists"
$SearchServiceApplication = Get-SPEnterpriseSearchServiceApplication -Identity $SearchServiceApplicationName -ErrorAction SilentlyContinue
if (!$SearchServiceApplication)
{
    Write-Host -ForegroundColor Yellow "Creating Search Service Application"
    $SearchServiceApplication = New-SPEnterpriseSearchServiceApplication -Name $SearchServiceApplicationName -ApplicationPool $SPServiceApplicationPool.Name -DatabaseServer $DatabaseServer -DatabaseName $DatabaseName
}

Write-Host -ForegroundColor DarkGray "Checking if SSA Proxy exists"
$SearchServiceApplicationProxy = Get-SPEnterpriseSearchServiceApplicationProxy -Identity $SearchServiceApplicationProxyName -ErrorAction SilentlyContinue
if (!$SearchServiceApplicationProxy)
{
    Write-Host -ForegroundColor Yellow "Creating SSA Proxy"
    New-SPEnterpriseSearchServiceApplicationProxy -Name $SearchServiceApplicationProxyName -SearchApplication $SearchServiceApplicationName
}

#==============================================================
#Start Search Service Instance on Server1
#==============================================================

$SearchServiceInstanceServer1 = Get-SPEnterpriseSearchServiceInstance -Local
Write-Host -ForegroundColor DarkGray "Checking if SSI is Online on Server1"
if ($SearchServiceInstanceServer1.Status -ne "Online")
{
    Write-Host -ForegroundColor Yellow "Starting Search Service Instance"
    Start-SPEnterpriseSearchServiceInstance -Identity $SearchServiceInstanceServer1
    #Re-read the instance on each pass; the Status property of a cached object is not refreshed
    while ((Get-SPEnterpriseSearchServiceInstance -Local).Status -ne "Online")
    {
        Start-Sleep -s 5
    }
    Write-Host -ForegroundColor Yellow "SSI on Server1 is started"
}

#==============================================================
#Start Search Service Instance on Server2
#==============================================================

$SearchServiceInstanceServer2 = Get-SPEnterpriseSearchServiceInstance -Identity "2013-SP-AFCache"
Write-Host -ForegroundColor DarkGray "Checking if SSI is Online on Server2"
if ($SearchServiceInstanceServer2.Status -ne "Online")
{
    Write-Host -ForegroundColor Yellow "Starting Search Service Instance"
    Start-SPEnterpriseSearchServiceInstance -Identity $SearchServiceInstanceServer2
    #Re-read the instance on each pass so the loop sees the updated status
    while ((Get-SPEnterpriseSearchServiceInstance -Identity "2013-SP-AFCache").Status -ne "Online")
    {
        Start-Sleep -s 5
    }
    Write-Host -ForegroundColor Yellow "SSI on Server2 is started"
}

#==============================================================
#Cannot make changes to topology in Active State.
#Create new topology to add components
#==============================================================

$InitialSearchTopology = $SearchServiceApplication | Get-SPEnterpriseSearchTopology -Active
$NewSearchTopology = $SearchServiceApplication | New-SPEnterpriseSearchTopology

#==============================================================
#Search Service Application Components on Server1
#Creating all components except Index (created later)
#==============================================================

New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1
New-SPEnterpriseSearchCrawlComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1
New-SPEnterpriseSearchAdminComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1

#==============================================================
#Search Service Application Components on Server2
#Crawl, Query, and CPC
#==============================================================

New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer2
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer2
New-SPEnterpriseSearchCrawlComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer2


The table below shows the planned index layout (which replica of each partition is primary):

                    Server1            Primary    Server2            Primary
IndexPartition 0    IndexComponent 1   True       IndexComponent 2   False
IndexPartition 1    IndexComponent 3   False      IndexComponent 4   True
IndexPartition 2    IndexComponent 5   True       IndexComponent 6   False
IndexPartition 3    IndexComponent 7   False      IndexComponent 8   True



#==============================================================
#Index Components with replicas
#==============================================================

New-SPEnterpriseSearchIndexComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1 -IndexPartition 0 -RootDirectory $IndexLocationServer1

New-SPEnterpriseSearchIndexComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer2 -IndexPartition 0 -RootDirectory $IndexLocationServer2

New-SPEnterpriseSearchIndexComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer2 -IndexPartition 1 -RootDirectory $IndexLocationServer2

New-SPEnterpriseSearchIndexComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1 -IndexPartition 1 -RootDirectory $IndexLocationServer1

New-SPEnterpriseSearchIndexComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1 -IndexPartition 2 -RootDirectory $IndexLocationServer1

New-SPEnterpriseSearchIndexComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer2 -IndexPartition 2 -RootDirectory $IndexLocationServer2

New-SPEnterpriseSearchIndexComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer2 -IndexPartition 3 -RootDirectory $IndexLocationServer2

New-SPEnterpriseSearchIndexComponent -SearchTopology $NewSearchTopology -SearchServiceInstance $SearchServiceInstanceServer1 -IndexPartition 3 -RootDirectory $IndexLocationServer1




# Server1 is the local server where the script is run.

For fault tolerance we need at least two index components (replicas) for each index partition. Here I create 4 index partitions with 8 index components. One index partition can serve up to 10 million items. As a good practice, the primary and secondary replicas should be balanced across the index servers, so Server1 will host 4 index components (2 of which are primary replicas) and Server2 will host the other 4 index components (again with 2 primary replicas).



* Please note that the cmdlets above will not create the primary replicas on the servers we want, because we run all the cmdlets at the same time without saving the topology in between. Ideally, we should create an index partition on one server and then run Set-SPEnterpriseSearchTopology; this ensures that the primary replica is created on the server we want. Running the same cmdlet on another server for the same index partition afterwards creates the secondary replica. For more details, see http://blogs.technet.com/b/speschka/archive/2012/12/02/adding-a-new-search-partition-and-replica-in-sharepoint-2013.aspx
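
A minimal sketch of that two-step approach, reusing the variables from the script above (clone, modify, activate is the standard pattern once a topology is active):

#Step 1: clone the active topology, add the replica that should become primary, and activate
$ActiveTopology = Get-SPEnterpriseSearchTopology -SearchApplication $SearchServiceApplication -Active
$CloneTopology = New-SPEnterpriseSearchTopology -SearchApplication $SearchServiceApplication -Clone -SearchTopology $ActiveTopology
New-SPEnterpriseSearchIndexComponent -SearchTopology $CloneTopology -SearchServiceInstance $SearchServiceInstanceServer1 -IndexPartition 0 -RootDirectory $IndexLocationServer1
Set-SPEnterpriseSearchTopology -Identity $CloneTopology

#Step 2: clone again and add the secondary replica for the same partition on the other server
$ActiveTopology = Get-SPEnterpriseSearchTopology -SearchApplication $SearchServiceApplication -Active
$CloneTopology = New-SPEnterpriseSearchTopology -SearchApplication $SearchServiceApplication -Clone -SearchTopology $ActiveTopology
New-SPEnterpriseSearchIndexComponent -SearchTopology $CloneTopology -SearchServiceInstance $SearchServiceInstanceServer2 -IndexPartition 0 -RootDirectory $IndexLocationServer2
Set-SPEnterpriseSearchTopology -Identity $CloneTopology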

When the above script is run in one pass to create multiple index partitions and replicas on different servers, you can see in the picture below that there is no particular order to how the replicas are created: the primary and secondary replicas are not created on the servers we wanted. If the server location of the primary index component matters to you, set the topology before you run the cmdlet that creates the secondary replica on the other server.






Now all search components are created in our new topology. Before it can take effect, the new topology has to be activated; remember that we also have an old topology that is still in the Active state.



We will use the Set-SPEnterpriseSearchTopology cmdlet, which does several important tasks: it activates the new topology [$NewSearchTopology.Activate()], deactivates all other active topologies, and sets the new topology as the current Enterprise Search topology.

#==============================================================
#Setting Search Topology using Set-SPEnterpriseSearchTopology
#==============================================================
Set-SPEnterpriseSearchTopology -Identity $NewSearchTopology

After running the Set-SPEnterpriseSearchTopology cmdlet, the topology will look like this:



As Set-SPEnterpriseSearchTopology has already done most of the job for us, only one thing remains: delete the old topology, as it's no longer required.

#==============================================================
#Clean-Up Operation
#==============================================================
Write-Host -ForegroundColor DarkGray "Deleting old topology"
Remove-SPEnterpriseSearchTopology -Identity $InitialSearchTopology -Confirm:$false
Write-Host -ForegroundColor Yellow "Old topology deleted"


When the $SearchServiceApplication | Get-SPEnterpriseSearchTopology cmdlet is run, you will now find just one topology (the new topology that we created).



#==============================================================
#Check Search Topology
#==============================================================
Get-SPEnterpriseSearchStatus -SearchApplication $SearchServiceApplication -Text
Write-Host -ForegroundColor Yellow "Search Service Application and Topology is configured!!"

In Central Administration, under the Search service application, you will find the topology displayed like this:



To work out how many of each search component your environment needs, use the scaling guidelines.



* New-SPEnterpriseSearchIndexComponent requires a folder for storing index files. In a multi-server search configuration, New-SPEnterpriseSearchIndexComponent checks for the existence of the RootDirectory on the wrong server: it only checks the machine where the PowerShell script is executed, even when the new index component is scheduled for another machine. If the folder is missing there, you will get an error message: New-SPEnterpriseSearchIndexComponent : Cannot bind parameter 'RootDirectory'

There are two workarounds for this:

1. Create the folder manually on the machine that runs the PowerShell script, as is done in the script above.

2. Use the SharePoint object model directly instead of the cmdlets.

$Server02 = (Get-SPServer "2013-SPHost2").Name
$EnterpriseSearchServiceApplication = Get-SPEnterpriseSearchServiceApplication
$ActiveTopology = $EnterpriseSearchServiceApplication.ActiveTopology.Clone()
$IndexComponent = (New-Object Microsoft.Office.Server.Search.Administration.Topology.IndexComponent $Server02,1)
$IndexComponent.RootDirectory = "D:\IndexComponent02"
$ActiveTopology.AddComponent($IndexComponent)
#Activate the modified clone so the new index component takes effect
$ActiveTopology.Activate()

* A note on removing an index component: if you have more than one active index replica for an index partition, you can remove a replica by performing the procedure "Remove a search component" in the article "Manage search components in SharePoint Server 2013". You cannot remove the last index replica of an index partition using this procedure. If you have to remove all index replicas from the search topology, you must remove and re-create the Search service application and create a completely new search topology that has the reduced number of index partitions.
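
That procedure follows the same clone-modify-activate pattern. A minimal sketch, assuming the replica to drop is named IndexComponent2 (a placeholder; list the real names with Get-SPEnterpriseSearchComponent):

$ssa = Get-SPEnterpriseSearchServiceApplication
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
$clone = New-SPEnterpriseSearchTopology -SearchApplication $ssa -Clone -SearchTopology $active
#Placeholder name; pick the real component from the Get-SPEnterpriseSearchComponent output
$component = Get-SPEnterpriseSearchComponent -SearchTopology $clone | Where-Object { $_.Name -eq "IndexComponent2" }
Remove-SPEnterpriseSearchComponent -Identity $component.ComponentId -SearchTopology $clone -Confirm:$false
Set-SPEnterpriseSearchTopology -Identity $clone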





Tuesday, October 14, 2014

Get List item values in ItemAdding event handler

In an ItemAdding event handler, properties.ListItem will be null, and properties.BeforeProperties will not give you the submitted values either. So how do we get the values entered in NewForm.aspx in the ItemAdding method? Here we go.
You can access the values using the AfterProperties property, but you have to pass the internal name of the column whose value you want:
properties.AfterProperties["Internal Name of the column"]
Or, if you don't know (or don't want to hard-code) the internal name, you can look it up via the Field.InternalName property. The code snippet below illustrates this:
// 'site' is the SPWeb hosting the list (for example, obtained from properties.OpenWeb())
string employeeInternalName = Convert.ToString(site.Lists[properties.ListId].Fields["EmployeeName"].InternalName);
string employeeName = Convert.ToString(properties.AfterProperties[employeeInternalName]);

Monday, October 13, 2014

Verify database upgrades in SharePoint 2013



Verify upgrade status for databases

You can use the following methods to verify upgrade:
  • Use the Upgrade Status page in Central Administration
    This page lists all farm, service, or content database upgrades and their statuses. This includes a count of errors or warnings.
  • Review the log files to look for errors or warnings
    If upgrade was not successfully completed, you can view the log files to find the issues, address them, and then restart the upgrade process.

Review the log files for database attach upgrade

To verify that upgrade has succeeded, you can review the following log and error files:
  • The upgrade log file and the upgrade error log file.
    Review the upgrade log file and the upgrade error log file (generated when you run the upgrade). The upgrade log file and the upgrade error log file are located at %COMMONPROGRAMFILES%\Microsoft Shared\Web server extensions\15\LOGS. The logs are named in the following format: Upgrade-YYYYMMDD-HHMMSS-SSS.log, where YYYYMMDD is the date and HHMMSS-SSS is the time (hours in 24-hour clock format, minutes, seconds, and milliseconds). The upgrade error log file combines all errors and warnings in a shorter file and is named Upgrade-YYYYMMDD-HHMMSS-SSS-error.log.

Check upgrade status for databases

The Upgrade Status page lists the upgrade sessions and gives details about the status of each session — whether it succeeded or failed, and how many errors or warnings occurred for each server. The Upgrade Status page also includes information about the log and error files for the upgrade process and suggests remedies for issues that might have occurred.
To view upgrade status in SharePoint Central Administration
  1. Verify that you have the following administrative credentials:
    • To use SharePoint Central Administration, you must be a member of the Farm Administrators group.
  2. On the Central Administration home page, in the Upgrade and Migration section, click Check upgrade status.
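
PowerShell offers a quick alternative check; a minimal sketch (NeedsUpgrade is a standard property on SharePoint persisted objects):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
#List each content database and whether it still needs an upgrade
Get-SPContentDatabase | Select-Object Name, NeedsUpgrade
#Farm-level flag: $true if any component still needs an upgrade
(Get-SPFarm).NeedsUpgrade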

Validate the upgraded environment

After you determine whether upgrade was completed successfully, validate your environment. Review the following items:
  • Service applications
    • Are they configured correctly?
    • Are the service application proxies configured the way that we want?
    • Do we have to create new connections between farms?
  • Site collections
    • Are sites that were not upgraded working as expected in 2010 mode?
    • Are all features associated with the sites working?
  • Search
    • Run a crawl, and review the log files.
    • Run search queries, and verify that the queries work as expected and provide appropriate results; a scripted query check follows this list. Twenty-four hours later, view the query reports and look for issues.
    • Search for people and profiles.
    • Check any Search customizations to make sure that they work as expected.
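
For the scripted query check, you can hit the Search REST endpoint; a hedged sketch (the site URL is an example, and the response shape assumes verbose OData JSON):

#Run a test query against the SP2013 Search REST API and report the hit count
$uri = "http://sharepointsite/_api/search/query?querytext='test'"
$r = Invoke-RestMethod -Uri $uri -UseDefaultCredentials -Headers @{ "Accept" = "application/json;odata=verbose" }
$r.d.query.PrimaryQueryResult.RelevantResults.TotalRows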

User Access issue after upgrading the SP 2010 site to SP 2013

Users coming from a SharePoint 2010 system who try to access SharePoint 2013 after a migration receive a “this site has not been shared with you” message. This means that they are not able to authenticate to SharePoint 2013.

In SharePoint 2013 there is a new authentication mechanism called claims-based authentication. By default, all web applications created through the UI use this mode.

I created a PowerShell script that loops through all of your SharePoint 2013 web applications and upgrades each one to claims-based authentication.

Script:
Param(
    [string]$account = $(Read-Host -Prompt "UserAccount")
)
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

#Encode the account as a claims principal once, outside the loop
#(re-encoding an already encoded value on later iterations would corrupt it)
$claimsAccount = (New-SPClaimsPrincipal -Identity $account -IdentityType 1).ToEncodedString()

foreach ($wa in Get-SPWebApplication)
{
    Write-Host "$($wa.Name) | $($wa.UseClaimsAuthentication)"
    #http://technet.microsoft.com/en-us/library/gg251985.aspx
    $wa.UseClaimsAuthentication = $true
    $wa.Update()
    $zp = $wa.ZonePolicies("Default")
    $p = $zp.Add($claimsAccount, "PSPolicy")
    $fc = $wa.PolicyRoles.GetSpecialRole("FullControl")
    $p.PolicyRoleBindings.Add($fc)
    $wa.Update()
    $wa.MigrateUsers($true)
    $wa.ProvisionGlobally()
}
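
Note that SharePoint 2013 also ships a dedicated cmdlet for this conversion; a hedged one-liner per web application (verify the exact switches for your build against the TechNet reference below):

#Convert a classic-mode web application to claims, retaining permissions
Convert-SPWebApplication -Identity "http://sharepointsite" -To Claims -RetainPermissions -Force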

For more information, refer to the TechNet article: http://technet.microsoft.com/en-us/library/gg251985.aspx

Sunday, September 28, 2014

Configure Crawl Sources in SharePoint 2013

How to start a full crawl in Central Administration
Before you can start a full crawl in Central Administration, you have to specify which content source should be crawled. When you run a full crawl, all content in the content source is crawled even if that content has already been added to the search index.
For this scenario, we'll crawl the Local SharePoint sites content source.
  1. Go to Central Administration --> Manage service applications --> Search Service Application --> Content Sources.
  2. On the Manage Content Sources page, hover over the Local SharePoint sites content source, and select Start Full Crawl from the menu.

The status of the crawl is shown in the Status column.
  3. Refresh this page until you see that the value in the Status column is Idle.
    This means that the crawl has finished.
  4. Optionally, you can verify that your items have been added to the search index by clicking Crawl Log.
    In our scenario, we now have 870 items in the search index, which is approximately the same number of products we have in the Products list.
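
If you prefer scripting, the same full crawl can be started and monitored from PowerShell; a minimal sketch using the content source above:

$ssa = Get-SPEnterpriseSearchServiceApplication
$cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
$cs.StartFullCrawl()
#Poll until the content source returns to Idle (re-read it on each pass)
while ($cs.CrawlState -ne "Idle")
{
    Start-Sleep -Seconds 10
    $cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
}
Write-Host "Full crawl finished"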

How to enable continuous crawls in Central Administration
You can only start a full crawl manually. Nobody wants the hassle of manually starting a crawl every time their catalog content changes; that is neither an efficient nor a practical way to work. To avoid this overhead, you can simply enable a continuous crawl of the content source that contains the catalog.
Continuous crawls start automatically at set intervals. Any changes made to the catalog since the previous crawl are picked up by the crawler and added to the search index.
To enable continuous crawls:
  1. Go to Central Administration --> Manage service applications --> Search Service Application --> Content Sources.
  2. On the Manage Content Sources page, click the content source for which you want to enable continuous crawls; in our scenario, this is Local SharePoint sites.
  3. Select the option Enable Continuous Crawls.
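
The same toggle is available from PowerShell; a short sketch:

$ssa = Get-SPEnterpriseSearchServiceApplication
#Enable continuous crawls on the catalog's content source
Set-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites" -EnableContinuousCrawls $true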

How to set continuous crawl interval
The default interval for continuous crawls is 15 minutes. You can set shorter intervals by using PowerShell. The code snippet below sets the continuous crawl interval to 1 minute.
$ssa = Get-SPEnterpriseSearchServiceApplication
$ssa.SetProperty("ContinuousCrawlInterval", 1)
So, by enabling continuous crawls, you can avoid a lot of frustration from content managers, as they no longer have to wait for Search service application administrators to start a crawl for them. However, for some catalog changes, for example enabling managed properties as refiners, continuous crawls are not sufficient, and you will need a full reindexing of the catalog content. Not to worry: content managers can initiate a full reindexing of the catalog themselves.
How to initiate a reindexing of the catalog
To mark a catalog for reindexing, here's what to do:
  1. On your catalog (in our case the Products list in the Product Catalog Site Collection), click the LIST tab --> List Settings --> Advanced Settings.
  2. On the Advanced Settings page, click Reindex List.

How to view crawl status and schedule for a catalog
You can view the crawl status and schedule for an individual catalog. To do this:
  1. On your catalog (in our case the Products list in the Product Catalog Site Collection), click the LIST tab --> List Settings --> Catalog Settings.
  2. On the Catalog Settings page, you can see when the catalog was last crawled, and what crawls are scheduled to run when.
    In our case, we can see that the catalog was last crawled on 3/4/2013 at 5:30:17 AM, and that continuous crawls are scheduled to run every 15 minutes.

So, all in all, content managers can be happy because their content is added to the search index at short intervals, and Search service application administrators can be happy because they are no longer bothered by content managers constantly asking them to start a crawl.

Upgrading the web.config file to SharePoint 2013

In SharePoint 2010, if we have a custom CAS policy defined that is also referenced in the web.config file, then after upgrading the site from 2010 to SharePoint 2013 that entry will be missing from the web.config file.

The <trustLevel> node is not present in web.config by default in SP2013, so we need to update our code to insert that node before adding the custom CAS policy.
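
One way to insert the node programmatically is SPWebConfigModification; a hedged sketch (the trust level name and policy file path are placeholders for your own CAS policy):

$wa = Get-SPWebApplication "http://sharepointsite"
$mod = New-Object Microsoft.SharePoint.Administration.SPWebConfigModification
$mod.Path = "configuration/system.web/securityPolicy"
$mod.Name = "trustLevel[@name='CustomTrust']"
$mod.Value = "<trustLevel name='CustomTrust' policyFile='C:\Policies\CustomTrust.config' />"
$mod.Owner = "CustomCasPolicy"
$mod.Type = [Microsoft.SharePoint.Administration.SPWebConfigModification+SPWebConfigModificationType]::EnsureChildNode
$wa.WebConfigModifications.Add($mod)
$wa.Update()
#Push the change into web.config on every web front end
[Microsoft.SharePoint.Administration.SPWebService]::ContentService.ApplyWebConfigModifications()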

PowerShell script for feature activation

Enable-SPFeature -identity "<Feature ID>" -URL http://sharepointsite
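
To activate the feature on every site collection in the farm, a small sketch (keep your own feature ID in the placeholder):

Get-SPSite -Limit All | ForEach-Object {
    Enable-SPFeature -Identity "<Feature ID>" -Url $_.Url -ErrorAction SilentlyContinue
}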

Configure Crawl Rules in SharePoint 2013

To create or edit a crawl rule
      1. Verify that the user account that is performing this procedure is an administrator for the Search service application.
      2. In Central Administration, in the Application Management section, click Manage Service Applications.
      3. On the Manage Service Applications page, in the list of service applications, click the Search service application.
      4. On the Search Administration page, in the Crawling section, click Crawl Rules. The Manage Crawl Rules page appears.
      5. To create a new crawl rule, click New Crawl Rule. To edit an existing crawl rule, in the list of crawl rules, point to the name of the crawl rule that you want to edit, click the arrow that appears, and then click Edit.
      6. On the Add Crawl Rule page, in the Path section:
  •       In the Path box, type the path to which the crawl rule will apply. You can use standard wildcard characters in the path.
  •        To use regular expressions instead of wildcard characters, select Use regular expression syntax for matching this rule.
7. In the Crawl Configuration section, select one of the following options:

      I.  Exclude all items in this path. Select this option if you want to exclude all items in the specified path from crawls. If you select this option, you can refine the exclusion by selecting the following:

          Exclude complex URLs (URLs that contain question marks). Select this option if you want to exclude URLs that contain parameters that use the question mark (?) notation.

      II. Include all items in this path. Select this option if you want all items in the path to be crawled. If you select this option, you can further refine the inclusion by selecting any combination of the following:

Follow links on the URL without crawling the URL itself. Select this option if you want to crawl links contained within the URL, but not the starting URL itself.

Crawl complex URLs (URLs that contain a question mark (?)). Select this option if you want to crawl URLs that contain parameters that use the question mark (?) notation.

Crawl SharePoint content as http pages. Normally, SharePoint sites are crawled by using a special protocol. Select this option if you want SharePoint sites to be crawled as HTTP pages instead. When the content is crawled by using the HTTP protocol, item permissions are not stored.

      8. In the Specify Authentication section, perform one of the following actions:

  •     To use the default content access account, select Use the default content access account.
  •     If you want to use a different account, select Specify a different content access account and then perform the following actions:
1.  In the Account box, type the user account name that can access the paths that are defined in this crawl rule.
2.  In the Password and Confirm Password boxes, type the password for this user account.
3.  To prevent basic authentication from being used, select the Do not allow Basic Authentication check box. The server attempts to use NTLM authentication. If NTLM authentication fails, the server attempts to use basic authentication unless the Do not allow Basic Authentication check box is selected.

  •     To use a client certificate for authentication, select Specify client certificate, expand the Certificate menu, and then select a certificate.
  •     To use form credentials for authentication, select Specify form credentials, type the form URL (the location of the page that accepts credentials information) in the Form URL box, and then click Enter Credentials. When the logon prompt from the remote server opens in a new window, type the form credentials with which you want to log on. You are prompted if the logon was successful. If the logon was successful, the credentials that are required for authentication are stored on the remote site.
  •     To use cookies, select Use cookie for crawling, and then select either of the following options:
1.  Obtain cookie from a URL. Select this option to obtain a cookie from a website or server.
2.  Specify cookie for crawling. Select this option to import a cookie from your local file system or a file share. You can optionally specify error pages in the Error pages (semi-colon delimited) box.

  •     To allow anonymous access, select Anonymous access.
      9. Click OK.
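
Crawl rules can also be managed from PowerShell; a minimal sketch of an exclusion rule (the path is an example):

$ssa = Get-SPEnterpriseSearchServiceApplication
#Exclude everything under the example path from crawling
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa -Path "http://sharepointsite/archive/*" -Type ExclusionRule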

To test a crawl rule on a URL
1.      Verify that the user account that is performing this procedure is an administrator for the Search service application.
2.      In Central Administration, in the Application Management section, click Manage Service Applications.
3.      On the Manage Service Applications page, in the list of service applications, click the Search service application.
4.      On the Search Administration page, in the Crawling section, click Crawl Rules.
5.      On the Manage Crawl Rules page, in the Type a URL and click test to find out if it matches a rule box, type the URL that you want to test.
6.      Click Test. The result of the test appears below the Type a URL and click test to find out if it matches a rule box.

To delete a crawl rule
1.      Verify that the user account that is performing this procedure is an administrator for the Search service application.
2.      In Central Administration, in the Application Management section, click Manage Service Applications.
3.      On the Manage Service Applications page, in the list of service applications, click the Search service application.
4.      On the Search Administration page, in the Crawling section, click Crawl Rules.
5.      On the Manage Crawl Rules page, in the list of crawl rules, point to the name of the crawl rule that you want to delete, click the arrow that appears, and then click Delete.
6.      Click OK to confirm that you want to delete this crawl rule.

To reorder crawl rules
1.      Verify that the user account that is performing this procedure is an administrator for the Search service application.
2.      In Central Administration, in the Application Management section, click Manage Service Applications.
3.      On the Manage Service Applications page, in the list of service applications, click the Search service application.
4.      On the Search Administration page, in the Crawling section, click Crawl Rules.

5.      On the Manage Crawl Rules page, in the list of crawl rules, in the Order column, specify the crawl rule position that you want the rule to occupy. Other values shift accordingly.

Configure Best Bets in SharePoint


SharePoint 2013 Preview transforms all your old Search Keywords or Best Bets into Query Rules.

So let’s create a Query Rule that fires on the exact query ‘image library’ or ‘picture library’, then promotes a result for the Image Library to the top of the page.

First, we’ll go to the Query Rules management page. On your search center’s upper-right-hand corner, click the gear icon, then select Site Settings.


Next, on the Site Settings page, under the Search heading, click Query Rules. Note that you may see a Search Query Rules link under Site Collection Administration. This happens if you’re the Site Collection administrator and the search center is the site collection’s root site. Don’t click that one; those Query Rules affect every site in the site collection, and for now we want to focus only on the search center site.


Now that you’re on the Query Rules page, the first question to ask is “Where will the user be?” For example, do you want to manage Query Rules for your main Enterprise Search? Or for People Search? Or Video Search? Each search experience, out-of-the-box or custom, can have its own Query Rules.

This is what we call the query’s context. You configure Query Rules for a particular context by using that first row of dropdowns in the Query Rules management page.


To manage Query Rules for a specific search experience, use the first dropdown to pick the Result Source for that experience. We’ll go into Result Sources in another post — for now, think of them as a SharePoint 2010 Federated Location plus a Search Scope. Each search experience sends queries to a Result Source, and that source guarantees results meeting certain conditions. For instance, People Search sends queries to the Local People Results source, which only returns People results.

We want our new Query Rule to fire on the main Enterprise Search. That search experience sends queries to the Local SharePoint Results source (which includes everything SharePoint crawls except People). So choose Local SharePoint Results from the first dropdown.

Next, click Add Rule to start creating your new rule.


Having picked a context, we just need to give the rule a name, then specify its conditions and actions. In other words, say when this rule will fire, and what it will do when it does. This is very similar to creating a Search Keyword in SharePoint 2010:

  1.  Give the rule a name: Image Library.
  2.  In the Query Conditions section, leave the condition type on “Query Matches Keyword Exactly”. In the textbox, type the queries we want to match, separated by semicolons: image library; picture library.
  3.  In the Actions section, since we want to promote a result to the top of the page, click Add Promoted Result. These are just like Best Bets in SharePoint 2010.


4. In the Add Promoted Result dialog, fill out the title, URL, and description.


5. Click Save in the dialog, then Save in the Add Query Rule page.
And that’s it…you’ve created a Query Rule! To try it out, go to your search center and search for ‘image library’ or ‘picture library’ (note that it can take a few seconds for the Query Rule to start working).



This Query Rule, while simple, demonstrates the high-level steps for creating all Query Rules.
  1.      Pick the context (e.g., queries sent to the Local SharePoint Results source).
  2.      Specify the conditions (e.g., fire if the query exactly matches ‘image library’ or ‘picture library’).
  3.      Specify the actions (e.g., promote a result for the Image Library).