SharePoint Escalation Services Team Blog

CRL Verification Failure – SharePoint


 

A couple of weeks back, I had a chance to work with a customer who reported multiple issues in his farm. These issues were followed by a farm-wide outage, and he ended up contacting Microsoft. I got involved as part of the escalation process, and after my initial discussion with him, these were his major concerns:

1. Site slowness after every IIS reset. The sites take minutes to load, and this has been an issue since the day they set up this production farm.

2. On a couple of servers in the farm, the configuration cache is not getting refreshed and OWSTIMER is not working correctly. After clearing the configuration cache and restarting the timer service, none of the files are re-created by the timer.

3. Because of the issues with the configuration cache, they were forced to remove one of the servers from the NLB (Network Load Balancer). Unfortunately, the one server left in the NLB could not handle the user load during business hours, and this ended in a farm-wide outage.

Our primary objective was to bring the server that had been pulled out of the NLB back into working condition. We started by looking at the timer job issue on the broken server. As always, our starting point for digging further was the ULS logs. We enabled verbose logging and started reviewing them. Unfortunately, the ULS logs were not helpful: the tracing service could not log much about OWSTIMER because the process was stuck. Basically, nothing other than the following was getting logged in the ULS:

OWSTIMER.EXE (0x08E0)        0x06A4        ULS Logging        Unified Logging Service        8pbc        Verbose WatsonRC is currently unavailable. Error Reporting will still work properly.

OWSTIMER.EXE (0x08E0)        0x06A4        ULS Logging        Unified Logging Service        8wsv        High        ULS Init Completed (OWSTIMER.EXE, ONETNA~1.DLL)

OWSTIMER.EXE (0x08E0)        0x06A4        Windows SharePoint Services        Timer        6gey        Verbose        Entered VtimerStore::HrInitTimer       

OWSTIMER.EXE (0x08E0)        0x06A4        Windows SharePoint Services        Timer        6gfa        Verbose        Entered VtimerStore::HrGetISPNativeConfigurationProvider       

OWSTIMER.EXE (0x08E0)        0x0C6C        ULS Logging        Unified Logging Service        7a8f        Verbose        Baseboard Manufacturer: 'Intel Corporation'

We tried to stop OWSTIMER from the service manager, and after a long wait we got the following error. So we always had to kill the process from Task Manager.

[Screenshot: error dialog shown when trying to stop the timer service]

We went back and looked at the configuration cache folder and found that a couple of XML files had been created in the last 30-45 minutes. This was interesting because it proved that files were getting created, but either the process was very slow, or some of the objects in the configuration database (Objects table) were not in good shape.

Since nothing from the ULS logs was helping us distinguish between object corruption and slowness, we decided to get a memory dump of OWSTIMER.EXE right after starting it on the non-working server. I collected a couple of memory dumps, and the analysis showed the following call stack:

ntdll!ZwWaitForSingleObject+0xa

kernel32!WaitForSingleObjectEx+0x9c

cryptnet!CryptRetrieveObjectByUrlWithTimeout+0x263

cryptnet!CryptRetrieveObjectByUrlW+0x20c

crypt32!ChainRetrieveObjectByUrlW+0x81

crypt32!CCertChainEngine::RetrieveCrossCertUrl+0x135

crypt32!CCertChainEngine::UpdateCrossCerts+0x196

crypt32!CCertChainEngine::Resync+0x1e8

crypt32!CCertChainEngine::CreateChainContextFromPathGraph+0x239

crypt32!CCertChainEngine::GetChainContext+0x8b

crypt32!CertGetCertificateChain+0x100

Most of the threads in OWSTIMER were checking certificate revocation lists (CRLs) to verify that signatures were still valid. When loading signed assemblies, the .NET Framework checks the Internet-based certificate revocation list to validate the signature of each DLL that is loaded. This also explained why they might be facing slow page loads after every IIS reset. We confirmed the same root cause for the page load issue by getting a memory dump of the w3wp process right after an IIS reset.

So, what is going on here? 

The problem is that when loading signed assemblies, the .NET Framework checks the Internet-based certificate revocation list. As our servers, like those in most secure environments, have no outgoing connections to the public Internet, the connection to crl.microsoft.com times out after what appears to be 30 seconds. It probably does this a couple of times in succession, causing a delay of a couple of minutes while SharePoint spins up.

After the timeout the assembly is still loaded and the software works as expected, though it is very slow every time a new signed assembly is loaded for the first time, which happens a lot. The worst part is that no entries are written to the event log and no exceptions are thrown, so you are left completely in the dark about why your application is so slow.

The workaround is a configuration setting that disables signature verification in a .NET Framework 2.0 managed application. You can use this setting in an application configuration file: add the following code to the <ApplicationName>.exe.config file of the .NET Framework 2.0 managed application.

For the timer service, we created a configuration file "OWSTIMER.EXE.config" in the BIN folder and disabled generatePublisherEvidence:

<configuration>
        <runtime>
                <generatePublisherEvidence enabled="false"/>
        </runtime>
</configuration>

We restarted the timer service, and the configuration cache rebuild completed in a few seconds. The sites also started loading in a few seconds after every IIS reset. For more details about the fix, please refer to http://support.microsoft.com/kb/936707/en-us .

For the site slowness after an IIS reset, we disabled the CRL check at the ASP.NET level on all servers by placing the same configuration in the "Aspnet.config" file.
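
For reference, it is the same element shown above; it goes inside the existing <runtime> section of Aspnet.config. (A note of caution: the Aspnet.config location is environment-specific; for 64-bit .NET 2.0 it typically sits under the Framework64\v2.0.50727 folder, so verify the path on your servers.)

<!-- Add inside the existing <runtime> section of Aspnet.config -->
<generatePublisherEvidence enabled="false"/>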

Quick ways to find out that the CRL check is failing (a scripted check also follows the list):

1. Collect a Fiddler trace on the server; you would see the requests going to crl.microsoft.com, but no response is obtained.

2. Collect a Netmon trace on the server to see either name resolution failing for crl.microsoft.com or no response to the requests sent to crl.microsoft.com.
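
A third quick check, a minimal sketch using plain .NET classes available from PowerShell on any SharePoint server, is simply to attempt a raw TCP connection to the CRL host:

# Attempt a raw TCP connection to the CRL distribution point on port 80
$client = New-Object System.Net.Sockets.TcpClient
try {
    $client.Connect("crl.microsoft.com", 80)
    Write-Host "Connected: the CRL endpoint is reachable"
}
catch {
    Write-Host "Connection failed: CRL checks will hang until they time out"
}
finally {
    $client.Close()
}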

Here is a screenshot of a Fiddler capture on a SharePoint server that does not have access to the Internet; the capture was taken while running SharePoint Management Shell, an STSADM command, and launching Central Administration from the Start menu.

[Screenshot: Fiddler capture showing requests to crl.microsoft.com with no responses]

Here is a sample network capture taken on a SharePoint server that is trying to reach the crl.microsoft.com site to check the CRL. First the server resolves the name crl.microsoft.com, and the name resolution looks good.

312 16:35:59 13-08-2012 10.30.30.154 SERVER1 <00> DNS DNS:QueryId = 0xA724, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet

315 16:35:59 13-08-2012 SERVER1 <00> 10.30.30.154 DNS DNS:QueryId = 0xA724, QUERY (Standard query), Response - Success, 63.84.95.48, 63.84.95.73 ...

 

The server then tries to establish a TCP session on port 80 with the IP address 63.84.95.48 and fails: the SYN is retransmitted and finally answered with a reset.

317 16:35:59 13-08-2012 Owstimer.exe 10.30.30.154 63.84.95.48 TCP TCP:Flags=......S., SrcPort=53053, DstPort=HTTP(80), PayloadLen=0, Seq=339758504, Ack=0, Win=8192 ( ) = 8192

554 16:36:02 13-08-2012 TmProxy.exe 10.30.30.154 63.84.95.48 TCP TCP:[SynReTransmit #317]Flags=......S., SrcPort=53053, DstPort=HTTP(80), PayloadLen=0, Seq=339758504, Ack=0, Win=8192 ( ) = 8192

685 16:36:06 13-08-2012 TmProxy.exe 63.84.95.48 10.30.30.154 TCP TCP:Flags=...A.R.., SrcPort=HTTP(80), DstPort=53053, PayloadLen=0, Seq=0, Ack=339758505, Win=8192

 

Post by: Sojesh Sreelayam [MSFT]


SharePoint 2010: Issues with Service Application Proxy & Proxy Group Associations


 

 

At times we see errors while accessing various service applications. These issues manifest with different symptoms in different places; here are a few examples I have come across.

Symptom 1:

When you try to access the Managed Metadata Service Application from Manage Service Applications, you see the following error:

"The Service Application being requested does not have a connection associated with the Central Administration web application. To access the term management tool use the site settings from a site configured with the appropriate connection"

Symptom 2:

While trying to add the BCS catalog to an External List on a site, or while trying to add the BCS catalog when creating a synchronization connection for a BCS source in the User Profile Service application, we get the following error:

"The Business data connectivity metadata store is unavailable. Check configuration and try again"

Symptom 3:

While the BCS sync step runs during a UPA profile sync, we might see this error in the ULS logs:

05/18/2012 01:00:39.14 miiserver.exe (0x0C1C) 0x23C8 SharePoint Portal Server User Profiles e96j High Error calling FindSpecific : Microsoft.BusinessData.Infrastructure.BdcException: The shim execution failed unexpectedly - Unable to obtain the application proxy for the context.. ---> Microsoft.Office.SecureStoreService.Server.SecureStoreServiceException: Unable to obtain the application proxy for the context.

at Microsoft.Office.SecureStoreService.Server.SecureStoreProvider.get_Proxy()

at Microsoft.Office.SecureStoreService.Server.SecureStoreProvider.GetRestrictedCredentials(String appId)

at Microsoft.SharePoint.BusinessData.Infrastructure.WindowsAuthenticator.ExecuteAfterLogonUser(Object[] args, ISecureStoreProvider ssoProvider, String ssoApplicationId, Boolean useSensitiveSsoCreds)

at Microsoft.SharePoint.BusinessData.Infrastructure.WindowsAuthenticator.ExecuteAfterLogonUser(Object[] args, ISecureStoreProvider ssoProvider, String ssoApplicationId)

at Microsoft.SharePoint.BusinessData.SystemSpecific.Db.DbConnectionManager.GetConnection()

Possible causes:

1. The association between the web application and the service applications, made via the service application proxy group, might be missing.

2. Various service accounts might have insufficient permissions to the service applications or the service application proxies.

Resolution:

1. Create a new proxy group with all the required service applications, and then move the required web applications to use this proxy group.

2. Check and assign permissions to the service applications/service application proxies for the required service accounts.

 

Here are sample commands to create a proxy group via SharePoint Management Shell. Note: Although this can also be done via the UI, the UI has a limitation: you cannot choose the name of the proxy group. After creating the group, you can associate a web application with it, as sketched after the script.

============================================================

$groupName = 'New Custom Service Group'

# Remove any existing proxy group with the same name
Remove-SPServiceApplicationProxyGroup -Identity $groupName -Confirm:$false -ErrorAction SilentlyContinue

# Create the new proxy group
$newProxyGroup = New-SPServiceApplicationProxyGroup -Name $groupName

# Add every existing service application proxy to the new group
$proxies = Get-SPServiceApplicationProxy
Add-SPServiceApplicationProxyGroupMember -Identity $newProxyGroup -Member $proxies

=============================================================
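
To then move a web application onto the new proxy group, a sketch (the web application URL is a placeholder, and this assumes the -ServiceApplicationProxyGroup parameter of Set-SPWebApplication available in SharePoint 2010):

$webApp = Get-SPWebApplication "http://webapp"
Set-SPWebApplication -Identity $webApp -ServiceApplicationProxyGroup $newProxyGroup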

Post By: Rajan Kapoor [MSFT]

Disclaimer
By using the following materials or sample code you agree to be bound by the license terms below and the Microsoft Partner Program Agreement the terms of which are incorporated herein by this reference. These license terms are an agreement between Microsoft Corporation (or, if applicable based on where you are located, one of its affiliates) and you. Any materials (other than sample code) we provide to you are for your internal use only. Any sample code is provided for the purpose of illustration only and is not intended to be used in a production environment. We grant you a nonexclusive, royalty-free right to use and modify the sample code and to reproduce and distribute the object code form of the sample code, provided that you agree: (i) to not use Microsoft's name, logo, or trademarks to market your software product in which the sample code is embedded; (ii) to include a valid copyright notice on your software product in which the sample code is embedded; (iii) to provide on behalf of and for the benefit of your subcontractors a disclaimer of warranties, exclusion of liability for indirect and consequential damages and a reasonable limitation of liability; and (iv) to indemnify, hold harmless, and defend Microsoft, its affiliates and suppliers from and against any third party claims or lawsuits, including attorneys' fees, that arise or result from the use or distribution of the sample.

SharePoint 2007: SSP upgrade issue with SP3


 

The Office Server Search gatherer crawls documents in batches. It creates a batch ID each time it loads documents from the crawl queue into SQL; these batch IDs are stored in the MSSBatchHistory table (part of the SSP Search DB). A limitation existed in Search where it did not clean up the MSSBatchHistory table upon crawl completion. As a result, the MSSBatchHistory table continues to grow until one day BatchID (defined as an identity column of type int) reaches the limit of int described here, which causes a failure in Search.

In order to overcome this situation, Microsoft released a fix that does two things:

a) Changes the BatchID column type from int to bigint

b) Deletes rows from MSSBatchHistory at the end of each crawl

 

This fix was released in the April 2011 CU (12.0.6557.5001), as described in the following KB articles:

For Windows SharePoint Services 3.0: KB2512780

For Office SharePoint Server 2007: KB2512781

So what's the Problem?

If you are upgrading to the April 2011 CU or Service Pack 3 for WSS 3.0/SharePoint 2007 from an older build (say, pre-April 2011 CU), this fix gets applied, and as part of the upgrade it changes the BatchID column from int to bigint.

This process can take a long time (many hours) depending on the number of records in the MSSBatchHistory table, and it will also grow the transaction log for the SSP Search database considerably.

Hence, it is recommended that you plan this upgrade and monitor it to completion.

Steps to monitor and plan for the upgrade:

1. You can run the following SQL command against the SSP Search database to look at the size and the number of rows in the table; the output will look like this:

sp_spaceused 'MSSBatchHistory'

Name             Rows       reserved    data        index_size  unused
MSSBatchHistory  262021461  6807152 KB  6805776 KB  112 KB      1264 KB

2. This is what the upgrade logs generated during the PSConfig execution will show:

[SearchDatabaseSequence] [DEBUG] [12/18/2011 12:11:53 PM]: Calling get SchemaVersion on Database BPA_SSP_Search, Status = Upgrading.

[SPManager] [DEBUG] [12/18/2011 12:11:53 PM]: [SearchSharedDatabase Name=SSP Name Parent=SPDatabaseServiceInstance] Running 1 of 1 steps

[SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: Begin Initialize()

[SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: End Initialize()

[SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: SearchQFE21038DatabaseAction.ShouldRun returns true

[SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: Begin Upgrade()

[SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: SearchQFE21038DatabaseAction.Upgrade: Changing MSSBatchHistory column BatchID type to bigint on database SQLServrName:SSP_searchDBName

3. Depending on the size of the 'MSSBatchHistory' table and number of rows, plan sufficient downtime.

4. Ensure the SSP Search DB is using the Simple recovery model (this is the default setting).

5. It is recommended to have a sufficient amount of disk space on the drive holding the transaction log for the SSP Search DB. If required, you may want to move the log to a drive with more space before the upgrade.

6. It is recommended to set MAXDOP = 1 on the SQL Server before the upgrade, as suggested here: http://blogs.msdn.com/b/rmeure/archive/2011/12/23/optimizing-sql-for-sharepoint.aspx. A sketch follows.
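
For reference, a sketch of setting MAXDOP = 1 on the SQL Server instance (run from SQL Server Management Studio; check with your DBA before changing instance-wide settings, and note the original value so it can be restored after the upgrade):

-- Enable advanced options, then set max degree of parallelism to 1
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE WITH OVERRIDE;
EXEC sp_configure 'max degree of parallelism', 1;
RECONFIGURE WITH OVERRIDE;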

References:

Estimate how long the upgrade process will take and the amount of space needed (Office SharePoint Server)

http://technet.microsoft.com/en-us/library/cc262891(office.12).aspx

 

Post by: Rajan Kapoor [MSFT]

MOSS 2007/2010: Current storage used by a site collection does not match the actual storage in the content DB


MOSS 2007 and SharePoint 2010 keep a record of the total storage consumed by each site collection; this is presented in Central Administration on the Site Collection Quotas and Locks page, and also on the Storage Space Allocation page within each site collection in MOSS 2007. This figure is used by SharePoint to determine whether a site collection is exceeding its configured storage quota.

Both the "Space Used" value on the "Storage Space Allocation site (_layouts/storman.aspx), and the "Current storage used" value under "Site

Collection Quotas and Locks" in Central Administration get their value directly from the "DiskUsed" value in the Sites table. The value of the "DiskUsed" column is size in bytes.

MOSS 2007:

[Screenshot: Storage Space Allocation page in MOSS 2007]

 

SharePoint 2010:

[Screenshot: Site Collection Quotas and Locks page in SharePoint 2010]

Occasionally this figure can be incorrect and may not accurately reflect the total amount of storage actually consumed by the site collection. If you encounter this scenario, you can tell SharePoint to recalculate the figure by using the Windows PowerShell commands below; these use the RecalculateStorageUsed method, as described in -

 http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spsite.recalculatestorageused.aspx.

Simply replace the URL with that of the site collection you want to recalculate, and then either run the commands manually or save them as a .ps1 file and execute it as a script.

MOSS 2007

[void][system.reflection.assembly]::loadwithpartialname("Microsoft.SharePoint")

$URL = "http://Moss2007"

$Site = New-Object Microsoft.SharePoint.SPSite($URL)

$Site.RecalculateStorageUsed()

SharePoint 2010

$URL = "http://Sp2010"

$Site = Get-SPSite -identity $URL

$Site.RecalculateStorageUsed()
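
If several site collections are suspect, a sketch that recalculates storage for every site collection in a SharePoint 2010 web application (the URL is a placeholder; expect this to take a while against large content databases):

Get-SPSite -WebApplication "http://Sp2010" -Limit All | ForEach-Object {
    # Recalculate the DiskUsed figure, then release the SPSite object
    $_.RecalculateStorageUsed()
    $_.Dispose()
}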

 

 

Disclaimer

=========

By using the following materials or sample code you agree to be bound by the license terms below and the Microsoft Partner Program Agreement the terms of which are incorporated herein by this reference. These license terms are an agreement between Microsoft Corporation (or, if applicable based on where you are located, one of its affiliates) and you. Any materials (other than sample code) we provide to you are for your internal use only. Any sample code is provided for the purpose of illustration only and is not intended to be used in a production environment. We grant you a nonexclusive, royalty-free right to use and modify the sample code and to reproduce and distribute the object code form of the sample code, provided that you agree: (i) to not use Microsoft's name, logo, or trademarks to market your software product in which the sample code is embedded; (ii) to include a valid copyright notice on your software product in which the sample code is embedded; (iii) to provide on behalf of and for the benefit of your subcontractors a disclaimer of warranties, exclusion of liability for indirect and consequential damages and a reasonable limitation of liability; and (iv) to indemnify, hold harmless, and defend Microsoft, its affiliates and suppliers from and against any third party claims or lawsuits, including attorneys' fees, that arise or result from the use or distribution of the sample.

 

Post by: Rajan Kapoor [MSFT]

Guidance for XslLink property with the XsltListViewWebPart


The XsltListViewWebPart and the DataFormWebPart share the same base class, BaseXsltDataWebPart, and work quite similarly. While for the DataFormWebPart it is OK and intended to use the Xsl and XslLink properties, in current SharePoint versions it is strongly recommended not to use the Xsl and XslLink properties with the XsltListViewWebPart. Both properties exist only because they are inherited from DataFormWebPart.

In fact, the Xsl property cannot be edited in the Web browser, as it is deliberately missing from the GUI.

The reason the XSL customization is limited for the XsltListViewWebPart is that this Web Part, which is used heavily all over the product, was optimized for performance. Not only the Web Part itself but also the out-of-the-box XSL files (like main.xsl in the layouts folder) were optimized to perform well in the browser. This is also why the compiled XSL is written into the Web Part cache, and why it is recommended not to make changes in the XsltListViewWebPart.

If you need a custom look and feel, a different Web Part exists that should be used instead: the DataFormWebPart. The DataFormWebPart works because its compiled XSL is cached differently, but using it would mean manually replacing each existing XsltListViewWebPart by touching the Web Part page with SharePoint Designer.

To insert a DataFormWebPart in SharePoint Designer, choose "Insert – Data View – Empty Data View" in the SharePoint Designer editor, click "Click here to select a data source", select the source list, and finally, in the drop-down list on the right under "Insert selected fields as…", select "Multiple Item View". After saving the changes, you can continue the modification either in SharePoint Designer or in the Web browser interface.

Speaking of alternatives for the XsltListViewWebPart: to work around this behavior, you need to stop using the XslLink property of the Web Part, as the caching is more or less bound to this property, and it can impact the performance of any page on which the Web Part exists. Other alternatives exist to make the XsltListViewWebPart use your XSL transform.

Workaround 1: One can use the Xsl property instead of the XslLink property.

You would need to specify some inline XSL like this:

<Xsl>

<xsl:stylesheet <!-- namespaces removed for readability --> >

<xsl:include href="/SiteAssets/Custom.xsl"/>

</xsl:stylesheet>

</Xsl>

The Xsl property works here because it does not need to be cached: the XsltListViewWebPart treats the content of the Xsl property as already-compiled XSL. Again, you need to use SharePoint Designer to set this property.

Workaround 2: Create a custom view per list and set the XslLink property on the view rather than on the Web Part; however, you would need to place your XSL file in the default "_layouts/xsl" folder.

Sample Code : Please note:

This source code is freeware and is provided on an "as is" basis without warranties of any kind, whether express or implied, including without limitation warranties that the code is free of defect, fit for a particular purpose or non-infringing. The entire risk as to the quality and performance of the code is with the end user.

Function CopyViewAndSetXslLink
{
    param($Url, $ListName, $ViewName, $NewViewName, $XslFileName)

    $web = Get-SPWeb $Url
    $list = $web.Lists[$ListName]
    $view = $list.Views[$ViewName]

    # Clone the existing view under a new name, keeping its row limit and paging settings
    $view = $view.Clone($NewViewName, $view.RowLimit, $view.Paged, $false)

    # Point the new view at the custom XSL file in the "_layouts/xsl" folder
    $view.XslLink = $XslFileName
    $view.Update()
}

CopyViewAndSetXslLink -Url "http://sp" -ListName "Shared Documents" -ViewName "All Items" -NewViewName "Xsl" -XslFileName "Custom.xsl"

In this example, for the document library "Shared Documents", the existing view "All Items" is duplicated and saved as a custom view named "Xsl". For this new view, the XslLink property is set to "Custom.xsl", which is expected to be stored in the "_layouts/xsl" folder.

This view "Xsl" can now be selected in the corresponding XsltListViewWebParts that are bound to this list/document library.

The reason caching is not an issue here is that the XSL file is placed in the layouts folder. Files in this folder are cached centrally, as they are intended to be used by views. Since they are already cached, neither the view nor the Web Part has to cache them.

So, by following this approach, you should even get a slight performance increase: the XSL file is cached once centrally, while with the XslLink-on-the-Web-Part approach it is cached once per individual Web Part.

 

Post by: Rajan Kapoor [MSFT]

 

 

Disclaimer

=========

By using the following materials or sample code you agree to be bound by the license terms below and the Microsoft Partner Program Agreement the terms of which are incorporated herein by this reference. These license terms are an agreement between Microsoft Corporation (or, if applicable based on where you are located, one of its affiliates) and you. Any materials (other than sample code) we provide to you are for your internal use only. Any sample code is provided for the purpose of illustration only and is not intended to be used in a production environment. We grant you a nonexclusive, royalty-free right to use and modify the sample code and to reproduce and distribute the object code form of the sample code, provided that you agree: (i) to not use Microsoft's name, logo, or trademarks to market your software product in which the sample code is embedded; (ii) to include a valid copyright notice on your software product in which the sample code is embedded; (iii) to provide on behalf of and for the benefit of your subcontractors a disclaimer of warranties, exclusion of liability for indirect and consequential damages and a reasonable limitation of liability; and (iv) to indemnify, hold harmless, and defend Microsoft, its affiliates and suppliers from and against any third party claims or lawsuits, including attorneys' fees, that arise or result from the use or distribution of the sample.

Proxy Configuration Issues with UPA in SharePoint 2010/2013


This post talks about various issues seen in User Profile Service application configuration that are caused by Web proxy configurations. As the user profile import using FIM has not changed in SharePoint 2013, these apply to SharePoint 2013 as well.

Issue 1: Unable to start the User Profile Synchronization service; it fails after 3 retries.

You see event IDs 3 and 234 logged in the Application event log on the server where the Sync service is being provisioned:

Log Name:      Application
Source:        ILM Web Service Configuration
Event ID:      234
Task Category: None
Level:         Warning
Computer:      ServerName.domain.com
Description:
ILM Certificate could not be created: netsh http error:netsh http add urlacl url=http://+:5726/ user=Domain\Spadmin sddl=D:(A;;GA;;;S-1-5-21-3995503830-178758855-2493544469-25442)

Log Name:      Application
Source:        Forefront Identity Manager
Date:           [Date and Time]
Event ID:      3
Level:         Error
Computer:     ServerName.Domain.com
Description:

.Net SqlClient Data Provider: System.Data.SqlClient.SqlException: HostId is not registered
   at Microsoft.ResourceManagement.Utilities.ExceptionManager.ThrowException(Exception exception)
   at Microsoft.ResourceManagement.Data.Exception.DataAccessExceptionManager.ThrowException(SqlException innerException)
   at Microsoft.ResourceManagement.Data.DataAccess.RetrieveWorkflowDataForHostActivator(Int16 hostId, Int16 pingIntervalSecs, Int32 activeHostedWorkflowDefinitionsSequenceNumber, Int16 workflowControlMessagesMaxPerMinute, Int16 requestRecoveryMaxPerMinute, Int16 requestCleanupMaxPerMinute, Boolean runRequestRecoveryScan, Boolean& doPolicyApplicationDispatch, ReadOnlyCollection`1& activeHostedWorkflowDefinitions, ReadOnlyCollection`1& workflowControlMessages, List`1& requestsToRedispatch)
   at Microsoft.ResourceManagement.Workflow.Hosting.HostActivator.RetrieveWorkflowDataForHostActivator()
   at Microsoft.ResourceManagement.Workflow.Hosting.HostActivator.ActivateHosts(Object source, ElapsedEventArgs e)

Log Name: Application
Source: Microsoft.ResourceManagement.ServiceHealthSource
Date:  [Date and Time]
Event ID: 22
Level: Error
Computer: ServerName.Domain.com

Description:
The Forefront Identity Manager Service cannot connect to the SQL Database Server.
The SQL Server could not be contacted. The connection failure may be due to a network failure, firewall configuration error, or other connection issue. Additionally, the SQL Server connection information could be configured incorrectly.
Verify that the SQL Server is reachable from the Forefront Identity Manager Service computer. Ensure that SQL Server is running, that the network connection is active, and that the firewall is configured properly. Last, verify the connection information has been configured properly. This configuration is stored in the Windows Registry.

 

 

Issue 2: You browse to the "Configure Synchronization Connections" page of the User Profile Service Application and are unable to see your existing sync connection. Additionally, you try to create a new sync connection and get an error:

[Date and Time] PM w3wp.exe (0x1108) 0x1284 SharePoint Portal Server User Profiles d3b3 High LoadConnections failed trying to fill the connections list. Most likely during RetriveResources because of permissions --- {1}. Available parameters: System.ServiceModel.ProtocolException: The remote server returned an unexpected response: (407) Proxy Authentication Required. ---> System.Net.WebException: The remote server returned an error: (407) Proxy Authentication Required. at System.Net.HttpWebRequest.GetResponse() at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout) --- End of inner exception stack trace --- Server stack trace:

at System.ServiceModel.Channels.HttpChannelUtilities.ValidateRequestReplyResponse(HttpWebRequest request, HttpWebResponse response, HttpChannelFactory factory, WebException. responseException, ChannelBinding channelBinding) at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout) at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation) at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message) Exception rethrown at [0]: at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at System.ServiceModel.Description.IMetadataExchange.Get(Message request)
at Microsoft.ResourceManagement.WebServices.MetadataClient.Get(String dialect, String identifier)
at Microsoft.ResourceManagement.WebServices.Client.ResourceManagementClient.SchemaManagerImplementation.RefreshSchema
at Microsoft.ResourceManagement.WebServices.ResourceManager.get_SchemaManager()
at Microsoft.ResourceManagement.WebServices.ResourceManager..ctor(String typeName, LocaleAwareClientHelper localePreferences, ContextualSecurityToken securityToken)
at Microsoft.Office.Server.UserProfiles.ConnectionManager.LoadConnections(Boolean fForUI) . bdb5f1ce-ab18-48d7-9374-78df992d5a0b

[Date and Time] w3wp.exe (0x1108) 0x1284 SharePoint Portal Server User Profiles a3xu High ConnectionManager.LoadConnections(): Could not find MOSS MA despite being marked as fully configured, was it deleted? bdb5f1ce-ab18-48d7-9374-78df992d5a0b

 

 

Issue 3: The MOSS_Export step fails, and the following error is seen in the ULS logs:

System.NullReferenceException: Object reference not set to an instance of an object.

   at Microsoft.Office.Server.UserProfiles.ManagementAgent.ProfileImportExportExtension.CreateChangeData(ModificationType modificationType, String[] changedAttributes, CSEntry csentry)

   at Microsoft.Office.Server.UserProfiles.ManagementAgent.ProfileImportExportExtension.Microsoft.MetadirectoryServices.IMAExtensibleCallExport.ExportEntry(ModificationType modificationType, String[] changedAttributes, CSEntry csentry)

Additionally, in a network capture, one may see the traffic going via a proxy and the request failing due to authentication. Here is a sample capture during the MOSS_Export phase showing the request to the MOSS server being sent via a proxy:

5        10:46:02 AM 11/27/2012        0.0197241        miiserver.exe        10.0.11.156        10.0.11.11        HTTP        HTTP:Request, POST http://MossServer:5725/ResourceManagementService/MEX         {HTTP:8, TCP:7, IPv4:6}

6        10:46:02 AM 11/27/2012        0.0200219        miiserver.exe        10.0.11.11        10.0.11.156        HTTP        HTTP:Response, HTTP/1.1, Status: Continue., URL: http://MossServer:5725/ResourceManagementService/MEX         {HTTP:8, TCP:7, IPv4:6}

7        10:46:02 AM 11/27/2012        0.0201059        miiserver.exe        10.0.11.156        10.0.11.11        WSTransfer        WSTransfer:Metadata Request Message        {SOAP:9, HTTP:8, TCP:7, IPv4:6}

8        10:46:02 AM 11/27/2012        0.0208326        miiserver.exe        10.0.11.11        10.0.11.156        HTTP        HTTP:Response, HTTP/1.1, Status: Proxy authentication required, URL: http://MossServer:5725/ResourceManagementService/MEX Using Multiple Authentication Methods, see frame details        {HTTP:8, TCP:7, IPv4:6}

Steps for resolution

=================

To resolve such issues, we need to identify how the web proxy has been configured in the environment, and then either remove it or set a bypass on the various components involved. Possible ways to identify a proxy (a quick .NET-level check is also sketched after the list):

a) IE configuration

b) From a command prompt, run: netsh winhttp show proxy

c) Network capture (using tools like Netmon) from the SharePoint server.

d) Dump the Group Policy settings and confirm whether the proxy setting is pushed via Group Policy

e) Check for a WPAD proxy configuration pushed via DHCP
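
A quick .NET-level check from PowerShell (a sketch; "MossServer" is a placeholder) shows which proxy, if any, the .NET default proxy settings would route the FIM endpoint through:

# Returns the proxy URI that would be used; if the address is bypassed or no
# proxy is configured, the target URI itself is returned.
$proxy = [System.Net.WebRequest]::DefaultWebProxy
$proxy.GetProxy([Uri]"http://MossServer:5725/ResourceManagementService/MEX")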

For UPA, one would need to configure a proxy bypass for the various components involved in the setup, configuration, and operation of the UPA. They are:

a) The OWSTIMER process, which provisions the UPA: C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\BIN\OWSTIMER.EXE.CONFIG

b) The Central Administration site: the web.config file of the CA site

c) The miiserver process, which runs the FIM sync: C:\Program Files\Microsoft Office Servers\14.0\Synchronization Service\Bin\miiserver.exe.config

d) The Forefront Identity Manager service: C:\Program Files\Microsoft Office Servers\14.0\Service\Microsoft.ResourceManagement.Service.exe.config

Note: Change the paths to the respective \15 folders for SharePoint 2013.

The following example adds three addresses to the bypass list. It is recommended to add this to all the configuration files listed above, as described here:

------------------------------------------------------------------
<configuration>
  <system.net>
    <defaultProxy>
      <bypasslist>
        <add address="[a-z]+\.contoso\.com" />
        <add address="192\.168\..*" />
        <add address="Netbios name of server" />
      </bypasslist>
    </defaultProxy>
  </system.net>
</configuration>
-----------------------------------------------------------------

 

The first entry bypasses the proxy for all servers in the contoso.com domain.

The second bypasses the proxy for all servers whose IP addresses begin with 192.168.

The third bypass entry is for the server's NetBIOS name.

 

Note: After the configuration files are updated, you need to restart the respective services for the changes to take effect.

 

Post by: Rajan Kapoor [MSFT]

SharePoint 2010/2013: UPA sync from Active Directory pulling accounts in as Domain:Username instead of Domain\Username


 

Scenario

You have a User Profile Service application configured in a SharePoint 2010/2013 environment, with an Active Directory sync connection configured for importing profiles. Once imported, the user profiles seen from the "Manage User Profiles" link in the User Profile Service application appear in Domain\Username format by default.

There could be a situation where profiles show up in Domain:Username format instead of Domain\Username.

Note: In LDAP/FBA claims scenarios, the profile's login name shows as MembershipProvider:LoginName. That is the default behavior.

Cause

This occurs if the ObjectSid (AD attribute) to SID (SharePoint attribute) mapping is missing from the MOSS management agent. Typically, this may occur if the mapping has been removed via the Manage User Properties link (Central Administration >> UPA) or via SharePoint Management Shell.

This mapping is created by the rules extension DLL while we provision the AD sync connection, as seen in the "Configure Attribute Flow" section of the AD management agent, and it is stored in the Profile Sync DB. Once removed, it is permanently deleted.

When it is removed from the UI, no warning message is shown to the end user indicating that this will cause issues and that the mapping should not be removed.

Resolution:

This mapping CANNOT be added back to the existing SharePoint attribute SID, either via the UI (you do not get that option) or via SharePoint Management Shell.

The way out in this situation is to reset the Sync DB (http://technet.microsoft.com/en-us/library/ff681014.aspx#resetSync) and then re-create the sync connections. A Sync DB reset requires the "My Site Cleanup Job" to be disabled and two full syncs plus one incremental sync to be run. A hedged sketch of the reset follows.
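
A sketch of the reset, following the TechNet article linked above (the GUIDs are placeholders you must look up with Get-SPServiceApplication and Get-SPDatabase; see the article for the full procedure, including stopping the synchronization service first):

# Identify the UPA and its Sync database (GUIDs are placeholders)
$upa    = Get-SPServiceApplication -Identity <UPA-GUID>
$syncdb = Get-SPDatabase -Identity <SyncDB-GUID>

# Take the Sync database offline and reset it
$syncdb.Unprovision()
$syncdb.Status = 'Offline'
$upa.ResetSynchronizationMachine()
$upa.ResetSynchronizationDatabase()

# Re-provision the emptied database
$syncdb.Provision()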

Therefore, it is recommended to note the following details before proceeding to reset the Synchronization database.

1. Sync connection details:

a. Selected OU Details

b. Account name used to query the AD sources

2. Any user/group connection filters specified on the sync connection

3. Custom Property Mappings for Additional Attributes

4. Configuration settings for any BCS connectors, etc., along with their attribute mappings


 

Published by: Rajan Kapoor [MSFT]

SharePoint 2010: Search Service application - Old databases are not removed when we try to rename the Property database or Crawl database using central admin site for the Search Service application that was provisioned via PowerShell - Part 1


 

I was recently working on a SharePoint issue where we were unable to rename the Property database or the Crawl database from the Central Administration site.

Here is the scenario:

We had provisioned a Search Service application using PowerShell. The script used by our customer to create the Search Service application was taken from a blog on the Internet (almost all blogs carry the same simple script for creating a Search Service application). The Search Service application was up and running without any issues.

The requirement now was to rename the Search Property and Crawl databases to match the organization's naming standards.

For example:

SSA_Property_DB to SSA_Property_DB_renamed

SSA_Crawl_DB to SSA_Crawl_DB_renamed

We tried to rename the database through UI as follows:

· In the Central Administration site, go to the Search Service application. Under "Search Application Topology", click Modify.

· Under Databases, click the Property database --> Edit Properties. Enter the new name in the Database Name field.

· Click OK followed by Apply Topology Changes.

· At this stage, the UI should show "Processing".

It takes at least 6 to 7 minutes to complete in the UI. If you check the SQL Server during this time, a new database with the specified name is created, and the data from the original Property DB is copied to the new database.

Now, the Issue:

After the task completes, the old DB is supposed to be deleted. In our case, however, the old database remained: we had two Property databases in the Search Application Topology, the old one as well as the new one. We were unable to delete the old one manually either. We were able to reproduce this in my test environment too.

The issue reproduces every time we use PowerShell to create the Search Service application. If we create the Search Service application using the UI, we are able to rename the database successfully (the old DB is deleted).

Cause:

The problem is in the script used to create the Search Service application. If you look at blogs on the net for creating the Search Service application using PowerShell, the skeleton of the script is usually something like this:

1. We start the search services using Start-SPEnterpriseSearchServiceInstance and create a new application pool using New-SPServiceApplicationPool

2. Create the Search application using New-SPEnterpriseSearchServiceApplication and a proxy using New-SPEnterpriseSearchServiceApplicationProxy

3. Provision the admin component using Set-SPEnterpriseSearchAdministrationComponent

4. Create a new Crawl Topology using New-SPEnterpriseSearchCrawlTopology

5. Create a Crawl component using New-SPEnterpriseSearchCrawlComponent and activate the Crawl Topology.

6. Create a Query component using New-SPEnterpriseSearchQueryComponent and activate the query Topology.

In steps 5 and 6, we always add a new crawl topology and a new query topology and activate them. We never delete the existing ones (which are created when we create a new Search Service application). So we end up having two query topologies, and both have references to the Property database. That is why renaming the Property database fails: the old database will not be deleted. The same happens with the crawl topology.

If we compare the properties of the Search Service Application that were created via PowerShell and UI, we see the following difference:

[Screenshot: property comparison of Search Service Applications created via PowerShell and via the UI]

Solution:

We need to modify the PowerShell script to create Search Service Application. To be specific, we need to remove the existing crawl/query topology after provisioning the Search Service application.

I have written an updated PowerShell script to achieve this. It creates a Search Service Application in a way similar to what the UI does. We can then successfully rename the Property or Crawl databases from the UI.

I am posting the script here:

Disclaimer:

Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant you a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that you agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys' fees, that arise or result from the use or distribution of the Sample Code.

##### *********************************************

# This is a sample script. It is provided as is. Please modify it and fine tune it as per your environment and business needs.

# In no way this is to be considered as a production ready script.

# Environment specific settings

$databaseServerName = "<DBservername>"

$searchServerName = "<Searchservermachine>"

$saAppPoolName = "<SSA_Appoolname>"

$appPoolUserName = "<domain\username>"

$searchSAName = "<SSA name>"

$DBName = "<SSA DB name>"

# Creating Application pool

$saAppPool = Get-SPServiceApplicationPool -Identity $saAppPoolName -EA 0

if($saAppPool -eq $null)

{

Write-Host " Creating Service Application Pool..."

$appPoolAccount = Get-SPManagedAccount -Identity $appPoolUserName -EA 0

if($appPoolAccount -eq $null)

{

Write-Host " Please supply the password for the Service Account..."

$appPoolCred = Get-Credential $appPoolUserName

$appPoolAccount = New-SPManagedAccount -Credential $appPoolCred -EA 0

}

$appPoolAccount = Get-SPManagedAccount -Identity $appPoolUserName -EA 0

if($appPoolAccount -eq $null)

{

Write-Host " Cannot create or find the managed account $appPoolUserName, please ensure the account exists."

Exit -1

}

New-SPServiceApplicationPool -Name $saAppPoolName -Account $appPoolAccount -EA 0 > $null

}

# Creating Search Service application

Write-Host " Creating Search Service and Proxy..."

Write-Host " Starting Services..."

Start-SPEnterpriseSearchServiceInstance $searchServerName

Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $searchServerName

Write-Host " Creating Search Application..."

$searchApp = New-SPEnterpriseSearchServiceApplication -Name $searchSAName -ApplicationPool $saAppPoolName -DatabaseServer $databaseServerName -DatabaseName $DBName

$searchInstance = Get-SPEnterpriseSearchServiceInstance $searchServerName

Write-Host " Creating Administration Component..."

$searchApp | Get-SPEnterpriseSearchAdministrationComponent | Set-SPEnterpriseSearchAdministrationComponent -SearchServiceInstance $searchInstance

# Crawl component

Write-Host " Creating Crawl Component..."

$InitialCrawlTopology = $searchApp | Get-SPEnterpriseSearchCrawlTopology -Active

$CrawlTopology = $searchApp | New-SPEnterpriseSearchCrawlTopology

$CrawlDatabase = ([array]($searchApp | Get-SPEnterpriseSearchCrawlDatabase))[0]

$CrawlComponent = New-SPEnterpriseSearchCrawlComponent -CrawlTopology $CrawlTopology -CrawlDatabase $CrawlDatabase -SearchServiceInstance $searchInstance

$CrawlTopology | Set-SPEnterpriseSearchCrawlTopology -Active

Write-Host -ForegroundColor white " Waiting for the old crawl topology to become inactive" -NoNewline

do {write-host -NoNewline .;Start-Sleep 6;} while ($InitialCrawlTopology.State -ne "Inactive")

$InitialCrawlTopology | Remove-SPEnterpriseSearchCrawlTopology -Confirm:$false

# Query component

Write-Host " Creating Query Component..."

$InitialQueryTopology = $searchApp | Get-SPEnterpriseSearchQueryTopology -Active

$QueryTopology = $searchApp | New-SPEnterpriseSearchQueryTopology -Partitions 1

$IndexPartition= (Get-SPEnterpriseSearchIndexPartition -QueryTopology $QueryTopology)

$QueryComponent = New-SPEnterpriseSearchQuerycomponent -QueryTopology $QueryTopology -IndexPartition $IndexPartition -SearchServiceInstance $searchInstance

$PropertyDatabase = ([array]($searchApp | Get-SPEnterpriseSearchPropertyDatabase))[0]

$IndexPartition | Set-SPEnterpriseSearchIndexPartition -PropertyDatabase $PropertyDatabase

$QueryTopology | Set-SPEnterpriseSearchQueryTopology -Active

Write-Host -ForegroundColor white " Waiting for the old query topology to become inactive" -NoNewline

do {write-host -NoNewline .;Start-Sleep 6;} while ($InitialQueryTopology.State -ne "Inactive")

$InitialQueryTopology | Remove-SPEnterpriseSearchQueryTopology -Confirm:$false

# Creating proxy

Write-Host " Creating Proxy..."

$searchAppProxy = New-SPEnterpriseSearchServiceApplicationProxy -Name "$searchSAName Proxy" -SearchApplication $searchSAName > $null

Write-Host "Search Service Application created successfully"

##### *********************************************

In my next post, I will include a script to fix existing Search Service Applications (the SSAs that were already created using the problem script and whose Crawl and Property databases cannot be renamed).

Part 2 is available at: http://blogs.msdn.com/b/spses/archive/2013/04/23/sharepoint-2010-search-service-application-old-databases-not-removed-when-we-try-to-rename-the-property-database-or-crawl-database-using-central-admin-site-for-the-search-service-application-that-was-provisioned-via-powershel.aspx

 

Author: Pradeep Martin Anchan [MSFT]


SharePoint 2010: Search Service application - Old databases not removed when we try to rename the Property database or Crawl database using central admin site for the Search Service application that was provisioned via PowerShell - Part 2


In my last blog, Part 1, I discussed the scenario where we had provisioned a Search Service application using PowerShell and how it causes issues while renaming the Search Property and Crawl databases. In this blog, I am including a script to fix existing Search Service Applications.

If we have existing Search Service Applications that were created using the problem PowerShell Script, we will not be able to rename the Crawl and Property databases for these SSAs in the UI. So what if you need to rename them?

Well, the solution is simple: we can write a PowerShell script to fix it. I am including a sample script to fix existing Search Service Applications so that the Crawl and Property databases can be renamed. The script loops through and lists all the inactive search topologies, then asks whether to delete them. If we choose "Y", it deletes all the inactive search topologies.

Disclaimer:

Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant you a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that you agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys' fees, that arise or result from the use or distribution of the Sample Code.

##### *********************************************

# This is a sample script. It is provided as is. Please modify it and fine tune it as per your environment and business needs.

# In no way this is to be considered as a production ready script.

# Environment specific settings

$SSAName = "<Name of existing SSA>"

## Initially we will list the inactive Crawl and Query topologies

$bFlag = 0;

Write-Host "`nQuery Topologies which are inactive:" -foregroundcolor green -backgroundcolor black

$QueryTopologies = Get-SPEnterpriseSearchQueryTopology -SearchApplication $SSAName

foreach($QueryTopology in $QueryTopologies)

{

if($QueryTopology.State -eq "Inactive")

{

Write-Host " Id: " $QueryTopology.Id ", Status: " $QueryTopology.State -foregroundcolor green -backgroundcolor black

Write-Host "`n"

$bFlag = 1;

}

}

Write-Host "Crawl Topologies which are inactive:" -foregroundcolor green -backgroundcolor black

$CrawlTopologies = Get-SPEnterpriseSearchCrawlTopology -SearchApplication $SSAName

foreach($CrawlTopology in $CrawlTopologies)

{

if($CrawlTopology.State -eq "inactive")

{

Write-Host " Id: " $CrawlTopology.Id ", Status: " $CrawlTopology.State -foregroundcolor green -backgroundcolor black

Write-Host "`n"

$bFlag = 1;

}

}

## Delete after getting confirmation.

if($bFlag -ne 1)

{

return;

}

$Delete = read-host -prompt "Do you want to delete all the crawl/Query Topologies which are inactive?`n (Y/N)"

if($Delete -eq "Y")

{

Write-Host "Deleting the Query Topologies which are inactive....." -foregroundcolor green -backgroundcolor black

foreach($QueryTopology in $QueryTopologies)

{

if($QueryTopology.State -eq "Inactive")

{

Write-Host " Deleting Query Topology --- Id: " $QueryTopology.Id -foregroundcolor green -backgroundcolor black

Write-Host "`n"

$QueryTopology.Delete();

}

}

Write-Host "Deleting the Crawl Topologies which are inactive....." -foregroundcolor green -backgroundcolor black

foreach($CrawlTopology in $CrawlTopologies)

{

if($CrawlTopology.State -eq "inactive")

{

Write-Host " Deleting Crawl Topology --- Id: " $CrawlTopology.Id -foregroundcolor green -backgroundcolor black

Write-Host "`n"

$CrawlTopology.Delete();

}

}

## We are done here.

Write-Host "Done." -foregroundcolor green -backgroundcolor black

return

}

if($Delete -eq "N")

{

Write-Host "You chose N" -foregroundcolor green -backgroundcolor black

return

}

Write-Host "You chose wrong option" -foregroundcolor green -backgroundcolor black

##### *********************************************

 

Author: Pradeep Martin Anchan [MSFT]

SharePoint 2010/2013: "Change Log" Timer Job is not cleaning up expired entries in the EventCache table


In SharePoint, every content database contains an EventCache table that is the "change log" for objects contained in the database. Each row in the table depicts a change to an object. Columns in the table contain information such as the date and time of the change, the type of object that was changed, the nature of the change, and a unique identifier for the object.

SharePoint has a "Change Log" timer job for each web application, scheduled to run on a weekly basis. This job removes expired entries from the change log of the respective web application.

Expiration of change log entries is governed by the ChangeLogExpirationEnabled and ChangeLogRetentionPeriod properties of the web application. A timer job called "Immediate Alerts" processes the events in the change log and sends out alerts to the users who have subscribed to them. This timer job then marks the EventCache entries as processed and updates the last-processed event details in the EventBatches table.
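
A sketch of inspecting these settings from SharePoint Management Shell (the web application URL is a placeholder):

$wa = Get-SPWebApplication "http://webapp"
$wa.ChangeLogExpirationEnabled   # $true if expired entries may be trimmed
$wa.ChangeLogRetentionPeriod     # a TimeSpan controlling how long changes are kept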

There are situations I have seen where the EventCache table is huge (millions of rows) and the entries are not being cleaned up as expected. Even when you detach the database from SharePoint and attach it back with the -ClearChangeLog switch, the EventCache table is not purged.

One of the reasons this happens is that the "Immediate Alerts" timer job is disabled. This leaves alerts unprocessed, and the change log cleanup job ignores the unprocessed entries.

One simply needs to enable the "Immediate Alerts" timer job for the web application by going to Central Administration, Monitoring, and the Review Job Definitions page.

Note: The Immediate Alerts job processes a couple of thousand records at each run. If there are millions of records to be processed and cleaned up, you may have to schedule the Immediate Alerts and Change Log timer jobs to run more frequently than their default schedules, as sketched below.
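
A sketch of enabling the job and tightening both schedules from SharePoint Management Shell (the URL is a placeholder; "job-immediate-alerts" and "job-change-log-expiration" are the internal job names as I recall them, so verify them with Get-SPTimerJob first):

$alerts = Get-SPTimerJob job-immediate-alerts -WebApplication "http://webapp"
Enable-SPTimerJob $alerts
Set-SPTimerJob $alerts -Schedule "every 5 minutes between 0 and 59"

$changeLog = Get-SPTimerJob job-change-log-expiration -WebApplication "http://webapp"
Set-SPTimerJob $changeLog -Schedule "daily at 03:00:00"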

References:

Overview of the Change Log

Timer job reference

Mount-SPContentDatabase with the -ClearChangeLog switch


Post by: Gireesh Kumar G [MSFT]

How to reduce the size of the logging database by purging old data from it


The WSS logging database grows very fast, and at times it may cause storage problems on a SharePoint Server 2010 farm.

More on the logging database

The logging database stores the following data:

· ULS logs from the 14 hive LOGS folder

· Raw resource-usage data

By default, the system retains 14 days of usage data in this database.

The retention period for this database can be checked and changed with the Get-SPUsageDefinition and Set-SPUsageDefinition PowerShell cmdlets.

To find the logging database name, go to: Central Administration -> Monitoring -> Configure usage and health data collection.

[Screenshot: Configure usage and health data collection page showing the logging database name]

How to purge old data from logging database

Follow the steps below to purge old data from the logging database and reduce its size.

Step 1: Run the PowerShell command Get-SPUsageDefinition.

[Screenshot: Get-SPUsageDefinition output]

Step 2: Now we need to find which table is taking most of the space inside the WSS logging database.

You can check this from SQL Server:

Open SQL Server Management Studio -> right-click your logging database -> Reports -> Standard Reports -> Disk Usage by Top Tables.

[Screenshot: Disk Usage by Top Tables report in SQL Server Management Studio]

Say the RequestUsage* tables are taking most of the space inside the WSS logging database; you can then bring down the retention period of the "Page Requests" event.

Or, if you want, you can bring down the retention period for all the events to any number of days, from 14 down to 1.

Step 3: Use the following command to purge the old table data from this database.

Set-SPUsageDefinition -Identity "Page Requests" -DaysRetained 3

Run the same PowerShell command, Get-SPUsageDefinition, again to cross-check whether the retention period has changed.

[Screenshot: Get-SPUsageDefinition output showing the updated retention period]
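
If you prefer to lower the retention for every event at once, a sketch:

# Reduce retention for all usage definitions to 3 days
Get-SPUsageDefinition | ForEach-Object {
    Set-SPUsageDefinition -Identity $_.Name -DaysRetained 3
}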

Step 4: After that, we need to run two timer jobs to clean up the old data: 'Microsoft SharePoint Foundation Usage Data Import' and 'Microsoft SharePoint Foundation Usage Data Processing'.

Go to SharePoint Central Administration -> Monitoring -> Configure usage and health data collection -> Log Collection Schedule.

This will take you to the timer jobs.

clip_image010

Click each job definition in turn and hit 'Run Now' to run the timer jobs.

clip_image012
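
Alternatively, the same two jobs can be started from PowerShell; a minimal sketch that matches on the display names shown above:

# Start the usage import and processing jobs immediately
Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Usage Data*" } | ForEach-Object { Start-SPTimerJob $_ }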

Once the timer jobs complete, we can check whether the database has released the space.

------------------------------------------------------------------------------------------------------------------------

Please note:

Although running these timer jobs has no significant performance impact on the SharePoint servers or on the SQL Server, it is suggested that you run them outside business hours.

 

Post By: Manhar Sharma [MSFT]


SharePoint 2013: Site in Read only mode after an interrupted backup


I came across an interesting issue recently where a customer was backing up a site collection via the SharePoint Management Shell and the backup process was abnormally terminated. This left the site in read-only mode, and the following message would show up when users browsed to the site:

“We apologize for any inconvenience, but we've made the site read only while we're making some improvements.”

clip_image001

Go to "Site collection Quota & locks" in Central Admin & here is how the Status looks like. It is locked as “Read only" & all settings are grayed out.

clip_image003

Here is more on this behavior and how to get out of this situation.

In SharePoint 2013, we introduced a MaintenanceMode property on the SPSite object which indicates that the site is undergoing maintenance and is read-only. The SPSite.MaintenanceMode flag can be set on a site for several reasons, such as the content database being in a read-only state, or the site collection being upgraded, backed up, or moved.

If the action that set this flag terminates in a way that leaves it set, we run into this situation.
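
To confirm this is the state the site is stuck in, here is a quick check sketch (using the same placeholder URL as the command below):

$site = Get-SPSite 'http://weburl/sites/sitecollectionurl'
$site.MaintenanceMode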

The way to clear this flag is to use the ClearMaintenanceMode method on the SPSiteAdministration object. Here is how it can be done via the SharePoint Management Shell:

------------------------------------------------

$Admin = New-Object Microsoft.SharePoint.Administration.SPSiteAdministration('http://weburl/sites/sitecollectionurl')

$Admin.ClearMaintenanceMode()

-------------------------------------------------

The SPSiteAdministration.ClearMaintenanceMode method was introduced in the April 2013 CU for SharePoint 2013, so you would need to upgrade the environment with the March 2013 and April 2013 updates available at the Update Center for SharePoint 2013.

 

References :

SPSite.MaintenanceMode property

SPSiteAdministration Class


Blog By: Rajan Kapoor [MSFT]

 

UPA Sync Service Startup issues due to invalid Name of the server


Recently I worked on a couple of issues where the User Profile Synchronization service wasn't getting started. The issue was specific to one server; even if we created a new UPA, User Profile Synchronization would not start. The ULS logs did not show any exceptions; they only showed that the Profile Synchronization Setup job runs and exits without any error, and the FIM synchronization service never gets started. After doing some research and testing some scenarios, I figured out that the User Profile Synchronization service does not start under the following conditions:

    • The SharePoint server has been renamed to another name using the Rename-SPServer PowerShell command

    • The server name is in the form of an IP address (like xx.xx.xx.xx)

    • There is a difference between the NetBIOS name of the server and the server name listed under Servers in Farm for the server on which we are trying to start the User Profile Synchronization service

      When the Profile Synchronization Setup timer job runs, it checks whether it is on the local machine by comparing the server name string with Environment.MachineName, which picks up its value from the registry. There is a chance that the hostname command run from a command prompt and the PowerShell command below will show a difference:

      [System.Environment]::MachineName

      If the NetBIOS name of the machine and the SharePoint server name do not match, the setup is rejected and the User Profile Synchronization service quits without any errors. It is highly recommended to use the NetBIOS name of the server and avoid using the FQDN (or IP address) as the server name while configuring a SharePoint farm.
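
      A quick comparison sketch: the machine name .NET reports next to the server names SharePoint has registered in the farm (any mismatch points at this issue):

      # Compare the .NET machine name with the farm's registered server names
      [System.Environment]::MachineName
      Get-SPServer | Select-Object Name, Role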

      Rudra Roy [MSFT]

      All you want to know about People Picker in SharePoint ( Functionality | Configuration | Troubleshooting ) Part-2


      Part 1 of this series talks about how People Picker works under the default configuration. In this post I will focus on what changes when the SharePoint servers in the farm and the principals (users or groups) belong to different Active Directory domains and forests.

      Review and evaluate your current AD infrastructure to determine the existing trust relationships, as this information is critical to determining People Picker's behaviour and the configuration required. If you need assistance figuring out the type of trust or its transitivity, please refer to the TechNet articles in the References section.

       

      Querying Additional Forest or Domains

       

      Scenario 1: The SharePoint server belongs to a forest and/or domain that has 2-way trusts established with other forests or domains

       

      The accounts of the users accessing SharePoint therefore belong to those trusted domains, and since the trust is 2-way, the identity of the IIS application pool is also capable of authenticating against those trusted domains. Hence all domains and forests with a two-way trust relationship are available without any additional configuration steps, as People Picker searches all two-way trusted forests and all two-way trusted domains by default.

      Note: If there is a forest-wide trust between two forests and users and/or resources belong to child domains, still no configuration is required at the SharePoint level, as a forest trust creates a transitive trust relationship between every domain in both forests.

       

      Scenario 2: The SharePoint server belongs to a forest and/or domain that has 1-way trusts established with other forests or domains

       

      In order to instruct SharePoint to query those trusted domains or forests, we need to configure access to them by using the Stsadm command-line tool and selecting an account to use when accessing each forest or domain. Those credentials must be from the forest/domain to be queried or from a trusted domain, as long as the account is allowed to authenticate and is not denied remote logon.

      You can use a different account for each target domain or forest, or the same account for all domains and forests.

       

      1. On every front-end Web server in the farm, at a command prompt, type the following command, and then press ENTER:

      STSADM.exe -o setapppassword -password key

      Note: The account and password used to access each domain or forest are stored on each front-end Web server in the farm. This key is an encryption string that is used to encrypt the password for the account that is used to access the forest or domain. The encryption string must be the same for all servers in the farm, and unique for each server farm in a deployment with multiple farms.

       

      2. On a front-end Web server, at a command prompt, type the following command, and then press ENTER:

      STSADM.exe -o setproperty -propertyname peoplepicker-searchadforests -propertyvalue <valid list of forests or domains> -url <URL of the Web application>

      Note: You have to provide a valid list of forests/domains to be queried as well as the credentials to do so; these can be specified as below.

      forest:DnsName,LoginName,Password

      domain:DnsName,LoginName,Password

      You can add multiple forests and/or domains by listing them in the format Forest/Domain:DnsName,LoginName,Password separated by semicolons. For example:

       

      STSADM.exe -o setproperty -propertyname peoplepicker-searchadforests -propertyvalue "forest:Contoso.com,Contoso\User1,PasswordofUser1;domain:Fabrikam.com,Fabrikam\User2,PasswordofUser2" -url http://webapp

       

      Note: You can omit the username and password if the application pool identity already has access to the target domain or forest; this would be the case where the application pool identity is from the trusted forest or domain itself.

       

      Special case: If the application pool identity is from the target forest or domain, you will be able to resolve all users from the target domain and forest, but you will be unable to resolve users from the local SharePoint domain using the Browse option (Checkname will still work **). You would then need to specify the local domain or forest via the STSADM command as well, although without a username and password.

       

      Key Observations & Considerations :

      1. In the above example notice how each block is separated by semi-colons and each element in a block is separated by comma.
      2. There is no space between peoplepicker and -searchadforests .
      3. Ensure that you have valid credentials for each forest/domain & passwords do not contain commas.
      4. If you need to query a forest the “forest” argument is specified, a forest-trust must be in place and/or if you need to query a domain the "domain" argument is specified , an external trust will suffice .
      5. If you configure a forest to be queried, it is not necessary to declare all or some child domains separately, they will be queried anyway .
      6. If “forest” is specified, the Global Catalog Service will be used to perform the query, if “Domain” is specified, the LDAP service will be used instead.
      7. The DC Locator process used as discussed in Default configuration is still applicable BUT is extended beyond the local forest boundaries. This means that the SharePoint servers must be able to resolve names of remote forests/domains domain controllers (SRV records and A records) , hence DNS must be correctly configured .
      8. Once you have configured peoplepicker for a web app & If you still wish to query the forest/domain SharePoint belongs to, you’ll have to add it as part of the parameter too.
      9. If you use “Selective authentication” or “SID Filtering” in order to restrict authentication through trusts, you must make sure that the IIS application pool identity is allowed to authenticate against the remote forest/domain it queries.
      10. If you add a cross Forest /Domain user to SharePoint Site there is an entry for the same in the Userinfo table for that web application . If later the trust is removed , you my still be able to Checkname that user as it already exists in SQL but cannot add it to the site & assign permissions etc .

       

      To remove the People Picker entry for a web application, run the STSADM command with a blank property value, i.e. start and end double quotes with no space in between:

      STSADM.exe -o setproperty -propertyname peoplepicker-searchadforests -propertyvalue "" -url http://webapp

       

      Here is a flow chart which depicts the process flow of configuring People Picker for cross-forest scenarios.

       

              clip_image001

       

      **Note: The People Picker settings added by the Stsadm peoplepicker-searchadforests command are honored by the Browse feature, while Checkname would still query all domains, local and trusted, and would be able to enumerate users provided the application pool identity has access to them. Once you add the People Picker entry, the ability to query the local SharePoint domain/forest is lost, and it too needs to be added to the list in the peoplepicker-searchadforests command.
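
      On SharePoint 2010 and later you can also inspect this per-web-application list from PowerShell; a read-only sketch (http://webapp is a placeholder URL):

      # List the forests/domains People Picker has been configured to search
      $wa = Get-SPWebApplication http://webapp
      $wa.PeoplePickerSettings.SearchActiveDirectoryDomains | Select-Object DomainName, IsForest, LoginName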

       

      References:

      TRUST TYPES

      FOREST TRUSTS

      Peoplepicker-searchadforests: Stsadm property

      Select users from multiple forest domains

      How to configure Active Directory to allow anonymous queries

       

      Post By: Rajan Kapoor [MSFT] 

      Blob Cache in SharePoint


       

      1. What is Blob Cache?

      Disk-based Caching for Binary Large Objects

      Disk-based caching controls caching for binary large objects (BLOBs) such as image, sound, and video files, as well as code fragments. Disk-based caching is extremely fast and eliminates the need for database round trips. BLOBs are retrieved from the database once and stored on the Web front end. Further requests are served from the cache and trimmed based on security.

      2. Enabling and modifying disk-based caching for SharePoint sites

      Disk-based caching is disabled by default. To enable and customize the disk-based cache, you must modify the following statement in the web.config file for the SharePoint Web application.

      By default, it looks like this:
      <BlobCache location="C:\blobCache" path="\.(gif|jpg|png|css|js)$" maxSize="10" enabled="false" />

      In order to improve the performance of your site, the BlobCache should be enabled.
      <BlobCache location="C:\blobCache" path="\.(gif|jpg|png|css|js)$" maxSize="10" enabled="true" />

      Example:

      <BlobCache location="C:\blobCache" path="\.(gif|jpg|png|css|js)$" maxSize="10" max-age="86400" enabled="false"/>

      In the preceding example:

      • location is the directory where the cached files will be stored
      • path specifies in the form of a regular expression which files are cached based on the file extension
      • maxSize is the maximum allowable size of the disk-based cache in gigabytes
      • max-age specifies the maximum amount of time in seconds that the client browser caches BLOBs downloaded to the client computer. If the downloaded items have not expired since the last download, the same items are not re-requested when the page is requested. The max-age attribute is set by default to 86400 seconds (that is, 24 hours), but it can be set to a time period of 0 or greater.
      • enabled is a Boolean that disables or enables the cache.

      3. Flushing the disk-based cache:

      We can flush the current site collection's object cache. To do this, browse to the following location on the web site:
      Site collection administration -> Site collection object cache -> Disk Based Cache Reset.

      If we have multiple WFEs in the farm, each WFE maintains its own copy of the disk-based cache. SharePoint does not have a Web user interface (UI) to flush the disk-based cache on all the servers in a farm, and neither is there an option to select a specific WFE.

      4. The option provided on the administration page for flushing the cache only flushes the cache on the web front end to which you are currently browsing.

      If you would like to flush the complete binary large object (BLOB) caches associated with a specific Web application on the different Web front-end computers in a farm, you can use the following STSADM command:

      STSADM -o setproperty -propertyname blobcacheflushcount -propertyvalue 11 -url http://mywebapp:port
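
      On SharePoint 2010 and later, the same farm-wide flush can also be done from PowerShell via the publishing object model; a hedged sketch (the URL is a placeholder):

      # Flush the BLOB cache for a web application across all servers
      $wa = Get-SPWebApplication "http://mywebapp:port"
      [Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($wa)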

      5. Internals of the BLOB cache functionality within SharePoint:

      After enabling the BLOB cache in web.config, do an IISRESET and browse to /settings.aspx first, instead of the home page (on a collaboration portal site).

      When we browse to settings.aspx, it creates the files change.bin, dump.bin and flushcount.bin (each 1 KB in size).

      Now browse to the home page; it creates a folder PUBLISHINGIMAGES, where all the images rendered from the database are stored, and the above bin files also get updated (we can see the difference in file size).

      Every time when a new image file is rendered from the database, the image folder gets a copy of the image. The bin files (index) will get updated with an additional entry for the new file.

      Every request from client will check the index file first. It will only check the cache index and not the image folder directly for the image. If the index file does not have an entry for the image, then the request is served from the database. During this time, a copy of the image will be stored in the images folder and the index file also gets updated, so that the next request will not go to the database.

      6. What happens if I manually delete just the image file in the cache folder, or if the image file in the cache folder gets corrupted?

      We will not see the image on the client page; we get a broken image ‘X’.

      7. How are these cache files stored?

      Filename.extension.cache is the file naming format.

      8. What can I do if the cached file gets corrupted?

      Recommendation: Reset the index file. Site settings -> Site collection object cache -> check the box for "Force this server to reset its disk based cache". This completely deletes all the images and resets the bin files back to 1 KB, so that the next request goes to the database and the complete index gets rebuilt.

      Workaround: If you do not want to reset the complete index, find the missing image and copy just that file to the cache folder, naming it "filename.extension.cache".
      Refresh the page now; it should pick up the image.

      9. How is the index file maintained?

      Every time the web application initializes, the index file is loaded from the blobcache folder into memory, and new entries keep getting added to the in-memory index.
      IISRESET /stop flushes the updated index file from memory to disk within the blobcache folder.
      IISRESET /start loads the index file from the blobcache folder into memory.
      As long as the application pool is alive, the index is updated in memory itself.

      10. Can an index file get corrupted?

      Index file corruption is possible only when IIS crashes or the index file is overwritten with wrong information. When IIS tries to load the index file from the blobcache folder into memory and identifies that it is corrupted (not a valid file), the index is completely rebuilt (flushed) as a fresh 1 KB file, and all the old entries and images are lost.

      11. In a farm environment, is there any way of having all the WFEs' BLOB caches synchronized with each other?

      No, this cannot be done, because SharePoint maintains the index and the cache files individually on each server.

      12. In a single-WFE environment using web gardening, how does BLOB caching function? Is there any sync that happens among the worker processes?

      Web gardening is not supported.

      13. What would be the best way of using the BLOB cache in a farm environment (best practices)?

      If the requirement is to "centralize" static cached files, then third-party content distribution network (CDN) solutions like Akamai need to be used.

      What are the limitations of SharePoint and BLOB caching in a farm environment?
      a. BLOB caching does not work with web gardening.
      b. BLOB caching does not synchronize data across WFEs, so we might see different versions of the files for a short duration across different WFE servers.

      Ported from http://blogs.msdn.com/b/selvagan/archive/2008/12/11/blobcache-moss.aspx

      Post By: Selvakumar Ganapathi [MSFT]



      SharePoint 2010: Issues with Service Application Proxy & Proxy Group Associations


       

       

      At times we see errors while accessing various service applications. These types of issues manifest with various symptoms at different locations; here are a few examples I have come across.

      Symptom 1:

      When you try to access the Managed Metadata service application from Manage Service Applications, you face the following error:

      "The Service Application being requested does not have a connection associated with the Central Administration web application. To access the term management tool use the site settings from a site configured with the appropriate connection"

      Symptom 2:

      While trying to add the BCS catalog to an external list on a site, or while trying to add the BCS catalog when creating a sync connection for a BCS source in the User Profile service application, we get the following error:

      "The Business data connectivity metadata store is unavailable. Check configuration and try again"

      Symptom 3:

      While the BCS sync step runs during a UPA profile sync, we might see this error in the ULS logs:

      05/18/2012 01:00:39.14 miiserver.exe (0x0C1C) 0x23C8 SharePoint Portal Server User Profiles e96j High Error calling FindSpecific : Microsoft.BusinessData.Infrastructure.BdcException: The shim execution failed unexpectedly - Unable to obtain the application proxy for the context.. ---> Microsoft.Office.SecureStoreService.Server.SecureStoreServiceException: Unable to obtain the application proxy for the context.

      at Microsoft.Office.SecureStoreService.Server.SecureStoreProvider.get_Proxy()

      at Microsoft.Office.SecureStoreService.Server.SecureStoreProvider.GetRestrictedCredentials(String appId)

      at Microsoft.SharePoint.BusinessData.Infrastructure.WindowsAuthenticator.ExecuteAfterLogonUser(Object[] args, ISecureStoreProvider ssoProvider, String ssoApplicationId, Boolean useSensitiveSsoCreds)

      at Microsoft.SharePoint.BusinessData.Infrastructure.WindowsAuthenticator.ExecuteAfterLogonUser(Object[] args, ISecureStoreProvider ssoProvider, String ssoApplicationId)

      at Microsoft.SharePoint.BusinessData.SystemSpecific.Db.DbConnectionManager.GetConnection()

      Possible causes:

      1. You might be missing an association between the web application and the service applications via the service application proxy group.

      2. You might have insufficient permissions on the service applications/service application proxies for various service accounts.

      Resolution:

      1. Create a new proxy group with all the required service applications, and then move the required web applications to use this proxy group.

      2. Check and assign permissions on the service applications/service application proxies for the required service accounts.

       

      Here are sample commands to create a proxy group via the SharePoint Management Shell. Note: Although this can also be done via the UI, the UI has the limitation that you cannot name the proxy group.

      ============================================================

      # Name for the new proxy group
      $GroupName = 'New Custom Service Group'

      # Remove any existing proxy group with the same name
      Remove-SPServiceApplicationProxyGroup -Identity $GroupName -Confirm:$false -ErrorAction SilentlyContinue

      # Create the proxy group and add every service application proxy to it
      $newProxyGroup = New-SPServiceApplicationProxyGroup -Name $GroupName

      $prxy = Get-SPServiceApplicationProxy

      Add-SPServiceApplicationProxyGroupMember -Identity $newProxyGroup -Member $prxy

      =============================================================
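
      To complete the fix, associate the affected web application with the new proxy group; a follow-up sketch (http://webapp is a placeholder URL):

      Set-SPWebApplication -Identity http://webapp -ServiceApplicationProxyGroup $newProxyGroup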

      Post By: Rajan Kapoor [MSFT]

      Disclaimer
      By using the following materials or sample code you agree to be bound by the license terms below and the Microsoft Partner Program Agreement the terms of which are incorporated herein by this reference. These license terms are an agreement between Microsoft Corporation (or, if applicable based on where you are located, one of its affiliates) and you. Any materials (other than sample code) we provide to you are for your internal use only. Any sample code is provided for the purpose of illustration only and is not intended to be used in a production environment. We grant you a nonexclusive, royalty-free right to use and modify the sample code and to reproduce and distribute the object code form of the sample code, provided that you agree: (i) to not use Microsoft's name, logo, or trademarks to market your software product in which the sample code is embedded; (ii) to include a valid copyright notice on your software product in which the sample code is embedded; (iii) to provide on behalf of and for the benefit of your subcontractors a disclaimer of warranties, exclusion of liability for indirect and consequential damages and a reasonable limitation of liability; and (iv) to indemnify, hold harmless, and defend Microsoft, its affiliates and suppliers from and against any third party claims or lawsuits, including attorneys' fees, that arise or result from the use or distribution of the sample.

      SharePoint 2007: SSP upgrade issue with SP3


       

      The Office Server Search gatherer crawls documents in batches. It creates a batch ID each time it loads documents from the crawl queue into SQL, and these batch IDs are stored in the MSSBatchHistory table (part of the SSP Search DB). A limitation existed in Search whereby it did not clean up the MSSBatchHistory table upon crawl completion. As a result, the MSSBatchHistory table continues to grow until one day BatchID (defined as an identity column of type int) reaches the limit of int described here, which would cause a failure in Search.

      In order to overcome this situation, Microsoft released a fix to:

      a) Change the BatchID column type from int to bigint

      b) Delete rows from MSSBatchHistory at the end of each crawl

       

      This fix was released in the April 2011 CU (12.0.6557.5001), as described in the following KB articles:

      For Windows SharePoint Services 3.0: KB2512780

      For Office SharePoint Server 2007: KB2512781

      So what's the Problem?

      If you are upgrading to the April 2011 CU or Service Pack 3 for WSS 3.0/SharePoint 2007 from an older build (say, pre-April 2011 CU), this fix will get applied, and as part of the upgrade it will change the BatchID column from int to bigint.

      This process can take a lot of time (many hours) depending upon the number of records in the MSSBatchHistory table, and it will also grow the transaction log for the SSP Search database considerably.

      Hence it is recommended that you plan this upgrade and monitor it to completion.

      Steps to monitor & plan for the upgrade:

      1. You can run the following SQL command on the SSP Search database to look at the size and number of rows in the table; the output will look like this:

      sp_spaceused 'MSSBatchHistory'

      Name              Rows         reserved     data         index_size   unused
      MSSBatchHistory   262021461    6807152 KB   6805776 KB   112 KB       1264 KB

      2. This is what the upgrade logs generated during the PSConfig execution will show,

      [SearchDatabaseSequence] [DEBUG] [12/18/2011 12:11:53 PM]: Calling get SchemaVersion on Database BPA_SSP_Search, Status = Upgrading.

      [SPManager] [DEBUG] [12/18/2011 12:11:53 PM]: [SearchSharedDatabase Name=SSP Name Parent=SPDatabaseServiceInstance] Running 1 of 1 steps

      [SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: Begin Initialize()

      [SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: End Initialize()

      [SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: SearchQFE21038DatabaseAction.ShouldRun returns true

      [SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: Begin Upgrade()

      [SearchQFE21038DatabaseAction] [12.2.508.0] [DEBUG] [12/18/2011 12:11:53 PM]: SearchQFE21038DatabaseAction.Upgrade: Changing MSSBatchHistory column BatchID type to bigint on database SQLServrName:SSP_searchDBName

      3. Depending on the size of the MSSBatchHistory table and the number of rows, plan sufficient downtime.

      4. Ensure the SSP Search DB is in the Simple recovery model (this is the default setting).

      5. It is recommended to have a sufficient amount of disk space on the drive holding the transaction log for the SSP Search DB. You may want to move the log to a drive with more space before the upgrade if required.

      6. It is recommended to set MAXDOP = 1 on the SQL Server before the upgrade, as suggested here: http://blogs.msdn.com/b/rmeure/archive/2011/12/23/optimizing-sql-for-sharepoint.aspx

      References:

      Estimate how long the upgrade process will take and the amount of space needed (Office SharePoint Server)

      http://technet.microsoft.com/en-us/library/cc262891(office.12).aspx

       

      Post By: Rajan Kapoor [MSFT]

      Office 365-Configure Hybrid Search with Directory Synchronization –Password Sync


      One of the key features of the SharePoint 2013 release is to provide our customers a way to move to the cloud on their own terms. A key requirement of today's IT industry is that cloud and on-premise solutions must co-exist. This is where hybrid scenarios come into existence. A hybrid environment allows organizations to retain the on-premise SharePoint Server environment they have and plan a phased transition of some workloads to the cloud. The new features in SharePoint 2013 make it possible to connect services running in both on-premise SharePoint and SharePoint Online in order to create an application that spans cloud and on-premise.

      A hybrid SharePoint environment is composed of SharePoint Server, deployed on-premise, and Microsoft Office 365 SharePoint Online. With the hybrid infrastructure in place you can then configure any of the workloads, like Search, Duet, or Business Connectivity Services. Hybrid Search allows SharePoint Search to include search results from a remote SharePoint server: SharePoint Server 2013 on-premise can display search results from SharePoint Online and vice versa. Hybrid Search does not return results by crawling the remote SharePoint, but via the new Result Sources and Query Rules features in SharePoint 2013.

      There is a lot of content available on why we need hybrid, on the authentication flow, and on how the components interact with each other; below are some references you can take a look at.

      Hybrid for SharePoint Server 2013

      Architecture Design Recommendation for SharePoint 2013 Hybrid Search Features

      This blog post series will focus on how to configure a hybrid Search environment and take a phased approach to achieving a two-way hybrid setup. In my recent tests I have validated that we should be able to set up Hybrid Search with the Password Sync option available in the recent release of DirSync, and you can eventually take a call on whether you need to deploy ADFS 2.0 for single sign-on. This will be a three-part post:

      Part 1: How to configure One Way Hybrid search with Password Sync.

      Part 2: How to Configure Two Way Hybrid search with Password Sync.

      Part 3: How to configure two way hybrid search with Single Sign on.

      Let's first take a quick look at what hybrid is and why we would even need it.

      Before we dive into the actual configuration, let's talk about an organization that is planning to embrace Office 365 and already has SharePoint on-premise. As I mentioned above, cloud and on-premise solutions will always co-exist. As the IT department starts migrating some of the workloads to the cloud, a key challenge people face during the migration phase with SharePoint is knowing where the content lives. That is exactly when people start implementing hybrid: to ensure that users are able to search for content regardless of where it lives, i.e. on-premise or online.

      At the time of writing this post, below are the supported hybrid scenarios. For the updated list of supported scenarios it is recommended to visit the SharePoint Online service descriptions.

       

      Scenario                        Works Out of Box
      SharePoint: Search              Yes
      SharePoint: BCS                 Yes
      SharePoint: Duet                Yes
      SharePoint 2013 - eDiscovery    Not Supported
      SharePoint Site Mailbox         Not Supported
      SharePoint 2013 Social          Not Supported

      Hybrid Search Deployment Options

      Below are the possible combinations in which a customer can deploy hybrid.

      Outbound Search (most common)

      Outbound from the customer's network, i.e. SharePoint on-premises, to SharePoint Online. A user on the customer's network (corpnet) searches from on-premises; an outbound request is made to SharePoint Online to return results, and results from both are shown.

      Inbound Search

      Inbound from SharePoint Online to the customer's network, i.e. SharePoint on-premises. A user who is not on the customer's network, but is signed into SPO, searches; an inbound request is made to the customer's network, i.e. SharePoint on-premises, to return results, and results from both are shown.

      Two-way Search

      Search is set up both inbound and outbound as described above. Both scenarios are supported in that case, whether the user is on-premises on corpnet or only signed in to SharePoint Online.

      Tip: Start small with outbound search first. Then, as needed, add inbound search.

      Components That Are Required To Deploy Hybrid

      Below is a pictorial representation of the components required in a two way Hybrid.

      image

       

      • On-premise SharePoint 2013 farm (install or upgrade)
      • Office 365 Enterprise: E1 (Search only), E3 or E4 plans
      • Directory Synchronization with or without password sync
      • Single Sign On
      • Server To Server Authentication
      • Reverse Proxy

      So let's first look at how the IT team will implement authentication and authorization for its employees as they deploy hybrid. One of the key requirements for Hybrid Search to work is a unique user identity for the person trying to get results across on-premise and SharePoint Online. A core requirement for the organization, as it uses an on-premises directory service, is to integrate it with the Windows Azure AD tenant in Office 365 and synchronize the users to the cloud. The Directory Synchronization tool, a 64-bit FIM-based utility, is used to synchronize on-premises directory objects (users, groups, contacts) to the cloud. Directory synchronization is also termed directory sync or DirSync. You can read more about directory sync and its features here. DirSync, with its recent release of Password Synchronization, has a significant role in the user sign-in experience. If an organization wants to enable users to log into Office 365 SharePoint Online using the same username and password as they use to log in to the corporate network, DirSync with Password Sync can be implemented. The primary benefit for users is that they do not need to remember two sets of credentials, which they would need without this feature. But it has a greater role to play in the world of hybrid and users seeing search results. When a user types a keyword in SharePoint on-premise and the results are intended to be returned from Online, SharePoint Online needs to rehydrate the user's identity in order to return search results. A quick overview of what happens: user A submits a query on-premise; the query gets sent over to SharePoint Online, and along with it some attributes of user A are also sent that help identify user A as user A in the remote farm. The attributes are UPN, SMTP address, SIP address, and name identifier. The user attributes from SharePoint on-premise are synchronized to the SharePoint Online directory services using DirSync. The rehydration process for a user includes any group memberships we find in the UPA. It does not matter how the user authenticates; it is all based on:

      · What identifying claim is sent to SharePoint, and

      · What profile is found in the UPA that matches that identifying claim, because we will grab any group memberships based on that profile.

      To rehydrate a user's identity, SharePoint takes the claims from the incoming access token and resolves them to a specific SharePoint user. SharePoint 2013 expects only one entry in the User Profile service application for a given lookup query that is based on one or more of these four attributes; otherwise, it returns an error condition that multiple user profiles were found. When the SharePoint Online farm receives the request, it looks up the online profile store to match those attributes of user A, and those are then passed along to display security-trimmed results. This entire process is termed rehydration of users. It has been described in great detail in Steve Peschka's blog post here.

      Implementing Outbound Search (most common scenario)

      The walkthrough below is from my test environment, where I will refer to the URLs and domain names below.

      SharePoint On-premise intranet URL : http://spweb

      SharePoint Extranet URL : https://spweb.mbspoincloud.com

      SharePoint Online URL : https://mbspoincloud.sharepoint.com

      On-premise domain: mbspoincloud.com

      Test User: manas@mbspoincloud.com

      SharePoint Onpremise

      For this post we are going to look at a user, manas, in the mbspoincloud domain (manas@mbspoincloud.com) who should be able to search for results from both SharePoint on-premise and SharePoint Online. To configure Hybrid Search you will need a SharePoint 2013 single server or farm (Standard or Enterprise CAL) with the services below enabled. For this post we will consider the SharePoint on-premises URL to be accessible at http://spweb. The following services must be enabled and configured in the on-premise SharePoint farm to support hybrid functionality:

      · User Profile Service

      · App Management Service

      · Subscription Settings Service

      · Search Service Application

      It is critical that your on-premise farm has the User Profile service up and running, and it must be populated with current data from Active Directory. The UPA on the local farm is used to determine what rights a user has, what claims they have, what groups they belong to, etc. Apps are dependent on the App Management and Subscription Settings service applications, and these are a prerequisite for SharePoint Online to be registered as a high-trust app in SharePoint Server 2013.
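
      If the Subscription Settings service application does not exist yet (it has no Central Administration UI), here is a provisioning sketch; the application pool, service application name, and database name are assumptions for illustration:

      # Create the Subscription Settings service application, its proxy, and start the service instance
      $pool = Get-SPServiceApplicationPool "SharePoint Web Services Default"
      $sub = New-SPSubscriptionSettingsServiceApplication -ApplicationPool $pool -Name "Subscription Settings" -DatabaseName "SubscriptionSettingsDB"
      New-SPSubscriptionSettingsServiceApplicationProxy -ServiceApplication $sub
      Get-SPServiceInstance | Where-Object { $_.TypeName -eq "Microsoft SharePoint Foundation Subscription Settings Service" } | ForEach-Object { Start-SPServiceInstance $_ }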

      image

      Add On-premise Domain to Office365

      One of the key requirements for Hybrid Search to work and return results from both SharePoint Online and SharePoint on-premise is to ensure the user identity can be matched on both sides, as I have explained above. The first steps are to add your on-premise domain as a vanity domain in your Office 365 subscription, validate that you are the owner of the domain, and then synchronize the users to Windows Azure Active Directory. As mentioned above, the on-premise domain for this post is mbspoincloud.com. Follow the steps below to add the domain to your Office 365 subscription.

      Log in to your Office 365 subscription with a Global admin account. From the admin center, click domains.

      image

      Click Add a domain.

      image

       

      Click Specify a domain name and confirm the ownership, or start step 1.

      image

       

      Enter the domain name of your on-premise domain. Note that only publicly registered and publicly routable domains can be added. In case you are testing this in a lab where you have a .local domain, you can use the concept of an alternate UPN suffix and use a publicly routable domain. For this post we will add our domain mbspoincloud.com and click Next.

      image

       

      For verification (Office 365 wants to ensure you own the specified domain), a DNS record (either TXT or MX) has to be created according to the displayed information. This step has to be done in the DNS configuration at the DNS registrar where the domain is registered. On the page, you need to select General instructions from the drop-down list and get the unique TXT record for your domain. You would then need to add this TXT record in the DNS zone where your domain has been registered. Office 365 provides you with step-by-step instructions on how to update the records for a couple of the DNS providers listed on the same page.

      image

       

      After adding the DNS record, it can take up to several hours until DNS is updated on the Internet. Wait at least 15 minutes before you click Done, verify now. Once you get a confirmation and click Next, you should be on a page identical to the following screenshot.

      image

       

      Click Start Step 2, select the I don't want to add users right now option, and then click Next. This is because our goal is to synchronize on-premise users to the cloud.

       

      image

      On the next screen, select SharePoint Online as your domain intent. You set a domain intent, or purpose, for your domain when you are adding it to Office 365. This is a selection of the services that you plan to use with your domain. Later, Office 365 uses this to show you the list of just the DNS records (such as an MX record) that you will need to create or update at your DNS hosting provider's website so that Office 365 services will work. By knowing the purpose, Office 365 can show you only the DNS records that apply to you, keeping the list shorter and less confusing.

      image

       

      Click Next. This will take you to the Design your new public website in Office 365 page. Since designing a public website is beyond the scope of this post, continue to click Next until you reach the page identical to the following screenshot. On the next page, click Finish.

      image

       

      You should be redirected to the Domain Management page in your Office 365 admin center, and your domain should show as Active.

      image

      Activate and configure DirSync

      The next step is to activate, install and configure DirSync to synchronize on-premise Active Directory users to the cloud directory services. You can't install DirSync on a domain controller, so you will need to install it on a member server. As I have mentioned above, directory synchronization is the synchronization of directory objects from your on-premises Active Directory to Windows Azure Active Directory for your Office 365 subscription. You can read more about directory synchronization in the article below.

      Plan for directory synchronization for Office 365

      The very first step is to activate Active Directory synchronization for your Office 365 environment. To do so, follow the steps below.

      • Browse to the Office 365 admin center at https://portal.microsoftonline.com.
      • On the Microsoft Online Services page, in the Windows Live ID field, provide an account name that has Global admin rights on your Office 365 subscription.
      • On the Home page, click Admin.
      • On the Admin page, select Users, which is under the Management section on the left side.
      • Select the Set up option next to Active Directory synchronization and click Activate.
      • Activation may take a couple of hours; once completed you should see "Active Directory synchronization is activated".

       

      image

      The next step is to install the Active Directory Synchronization tool. This needs to be installed on a dedicated member server in your on-premise environment (as noted above, it cannot be installed on a domain controller). Once you have validated that Active Directory synchronization shows as activated, you can download DirSync version 6382.0000 or greater from within the portal itself.

       

      On the same page where you validated that Active Directory synchronization is set to yes, you will see an option in step 4 to install and configure the directory sync tool.

       

      image

       

      Note : It’s recommended to go through the list of best practices listed here before you configure Dirsync on your domain.

      Click Download and save the file locally to your desktop; once downloaded, start the installation. The installation is quite simple and does not require any specific user input except for the install location.

      The next step is to synchronize Active Directory.

      • Log on to the machine that you installed DirSync.
      • Click Start, click All Programs, click Microsoft Online Services, click Directory Synchronization, and then click Directory Sync Configuration.
      • On the Welcome page, click Next.
      • On the Microsoft Online Services Credentials page, in the User name field, type the details of an account that has Global Admin rights on your Office 365 subscription.

      image

      On the Active Directory Credentials page, in the User name field, type an account with administrator permissions on your organization's Active Directory service. These credentials will be used to set the permissions for the directory sync tool in your on-premise Active Directory.

      image

       

      Click Next on the Hybrid Deployment option. If this is grayed out, do not worry; it means that your on-premise environment does not have Exchange installed and the AD schema has not been extended with Exchange attributes. This option is there to enable write-back options for Exchange. The most important step for enabling password sync is on the next screen.

      On the Password Synchronization page, ensure Enable Password Sync is checked.

      image

       

       

      On the Configuration page, click Next.

      image

       

      On the Finished page, verify that the Synchronize directories now check box is selected and then click Finish.
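
      If you later need to trigger a sync manually rather than waiting for the next scheduled cycle, here is a hedged sketch (the shell path varies by DirSync version and install location):

      # Open the DirSync configuration shell, e.g.
      #   C:\Program Files\Windows Azure Active Directory Sync\DirSyncConfigShell.psc1
      # and then run:
      Start-OnlineCoexistenceSync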

      Verify DirSync

      As I mentioned earlier, DirSync is a FIM-based tool, so you can use your favorite MIIS client to check for errors and look at what is being synchronized. To check whether DirSync is working correctly, go to C:\Program Files\Microsoft Online Directory Sync\SYNCBUS\Synchronization Service\UIShell\miisclient.exe and check the accounts getting synchronized.

      image

       

      Alternatively, you can verify DirSync from your Office 365 dashboard itself, from the Users and Groups section. This is a mandatory step, as the users that are synchronized need to be activated and licenses need to be assigned. To do so, follow the steps below.

      Launch your browser, type https://portal.microsoftonline.com in the Address field, and then press Enter.

      When asked to authenticate, log on with your Office 365 Global admin account. In the left navigation pane, under User Management, click Users. If the synchronized users do not appear, refresh the browser window.

      On the users' page you should see a list of users that have been synchronized from your on-premises Active Directory; they will show up as "Synced with Active Directory" under the status section.

      Select the list of users you want to activate, for example manas@mbspoincloud.com, and click Activate Synced Users.

      image

       

      You would need to select a location and license option, click Next, and then click Activate. A successful activation should take you to a screen similar to the following screenshot.

      image

       

      The next step is to add the user manas@mbspoincloud.com to your SharePoint Online site. Browse to the SharePoint Online site collection that you want to configure to return search results when searched from SharePoint on-premise. Click the gear icon, click Site settings, and add the user that you have synchronized from your on-premise Active Directory. You can choose to add the user manas@mbspoincloud.com to any group, but I would go with the Visitors group.

       

      image

      The next step is to log in with the account that you have synchronized from your on-premise Active Directory. Browse to the SharePoint Online URL, for example https://mbspoincloud.sharepoint.com, and when prompted for credentials type in manas@mbspoincloud.com and the password the user uses to sign in on-premise. You should be able to sign in and access the SharePoint Online site collection. This confirms that your on-premise user has been successfully synced to Azure Active Directory, has the correct set of licenses, and is in the SharePoint Online profile database via the profile import that automatically picks up the users from the online Active Directory.

      At this stage I would also recommend uploading a few documents to your SharePoint Online site collection so that the continuous crawl picks them up. Ensure that the logged-in user can search for those documents in SharePoint Online as well. This is just to verify that an on-premise user can successfully log in to SharePoint Online with the same credentials and search for content.

So let's look at the bigger picture of what we have achieved so far. User Manas, who is a user in the domain mbspoincloud.com, can log in to his corporate network and access his corporate portal http://spweb with his username manas@mbspoincloud.com and his password. The company also has a corporate portal in Office 365 SharePoint Online at https://mbspoincloud.sharepoint.com. Since his IT department has installed and configured DirSync with password sync, he can now access the online portal with the same credentials as on-premise. He is very happy that he does not need to remember two sets of credentials, but he still has a challenge: he does not get a unified search experience between the two environments. He still needs to remember where a document resides (online or on-premise) in order to find it. This is where hybrid comes into play. Let's now take a look at the next set of configurations.

      Establish Server-to-Server Trust with Windows Azure ACS

To configure server-to-server authentication for hybrid environments, you have to establish trust with ACS, the trust broker for both the on-premise and online SharePoint servers. In order to establish server-to-server trust with Windows Azure ACS, the certificate of the Security Token Service (STS) of SharePoint Server 2013 has to be replaced. By default, SharePoint uses a self-signed certificate for its Security Token Service communication, but that default certificate cannot be used by ACS, which acts as the trust broker for server-to-server authentication. It is recommended to use a self-signed certificate for this purpose. The next set of steps will help you generate a self-signed certificate and then upload it to SharePoint Online. Note: My SharePoint on-premise farm is a single server; otherwise, you need to replace the STS certificate across all SharePoint servers in your farm.

      Navigate to C drive and create a folder called cert. This folder would be used to store the certificates that would be required throughout the setup.

The first step is to create a self-signed certificate. There are several ways to create one; one possibility is to use the Internet Information Services (IIS) Manager of IIS 7 or higher.

      • Launch Internet Information Services (IIS) Manager.
      • In the MMC console tree, click the server name.
      • In the Details pane, double-click Server Certificates in the IIS group.

      image

       

      In the Actions pane, click Create Self-Signed Certificate.

      image

       

On the Specify Friendly Name page, type a name for the certificate, and then click OK (any name will do).

      image

       

      In order to replace the STS certificate, the certificate is needed in Personal Information Exchange (PFX) format including the private key. In order to set up a trust with Office 365 and Windows Azure ACS, the certificate is needed in CER Base64 format. This means, for our scenario, the certificate is needed in both formats PFX and CER.

      To export the self-signed certificate in PFX format:

      · In the Details pane, right-click the new certificate and then click Export.

      · In Export Certificate, specify a path and name to store the .pfx file for the certificate in the Export to field.

· Type a password for the certificate file in the Password and Confirm password fields. This creates the certificate in PFX format containing the private key. For our lab, the export parameters should be specified as follows:

        •  Path: C:\cert folder
        •  File Name: stscert.pfx
        •  Password: LS1setup!

      image

       

The next step is to export the self-signed certificate in CER Base64 format. In the Details pane, right-click the new certificate, and then click View.

      image

      Click the Details tab and then click Copy to File.

      image

On the Welcome to the Certificate Export Wizard page, click Next. On the Export Private Key page, click Next (the private key is not needed for the .cer file).

      image

      On the Export File Format page, click Base-64 encoded X.509 (.CER), and then click Next.

      image

       

On the File to Export page, type a path and file name for the .cer file, and then click Next. For our lab, the export parameters should be specified as follows:

        •  Path: C:\cert folder
        •  File Name: stscert.cer

      image

      On the Completing the Certificate Export Wizard page, click Finish, click OK, and then click OK again. This creates the certificate in CER Base64 format.

      On successful completion of the above steps, the C:\cert folder should have the following two types of certificates:

      · Stscert.pfx: In order to replace the STS certificate, the certificate is needed in Personal Information Exchange (PFX) format including the private key.

· Stscert.cer: In order to set up a trust with Office 365 and Windows Azure ACS, the certificate is needed in CER Base64 format.

      image
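If you prefer to script the certificate creation and export instead of using the IIS Manager UI, the following is a minimal sketch. It assumes Windows Server 2012 or later (for New-SelfSignedCertificate and the PKI module) and uses certutil to produce the Base64 .cer file:

# Scripted alternative to the IIS Manager steps above
$cert = New-SelfSignedCertificate -DnsName "*.yourdomainname.com" -CertStoreLocation Cert:\LocalMachine\My
$pfxPwd = ConvertTo-SecureString -String "LS1setup!" -Force -AsPlainText
Export-PfxCertificate -Cert $cert -FilePath C:\cert\stscert.pfx -Password $pfxPwd    # PFX including the private key
Export-Certificate -Cert $cert -FilePath C:\cert\stscert.der                         # DER-encoded public certificate
certutil -encode C:\cert\stscert.der C:\cert\stscert.cer                             # re-encode as Base64 (.cer)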

       

      Replacing the STS certificate and establishing a trust with the Windows Azure ACS has to happen through Windows PowerShell.

      Note:

Before following the steps to establish a trust between ACS and the SharePoint on-premise farm, two sets of modules need to be loaded for Windows PowerShell:

      1. SharePoint 2013 Management Shell

      2. Microsoft Online Services Module for Windows PowerShell (Extended) [This requires also the Sign In Assistant to be installed first]

Download the Windows Azure Active Directory Module for Windows PowerShell

      1. Browse to http://technet.microsoft.com/library/jj151815.aspx.

      2. Download Windows Azure Active Directory Module for Windows PowerShell (64-bit version).

The installation does not require any additional inputs; follow the instructions on the screen to complete the installation process. Replacing the STS certificate and establishing a trust with Windows Azure ACS is the next step. We recommend executing all the Windows PowerShell cmdlets on a server acting as a SharePoint Server 2013 front-end (WFE); therefore, the Microsoft Online Services Module for Windows PowerShell has to be installed on the SharePoint server. If this option is not available, the cmdlets can be executed on separate servers as well.

      Note: The following script will only work if executed from a SharePoint Server 2013 Front-End (WFE) where Microsoft Online Services Module for Windows PowerShell has been installed.

       

In the following script, we assume the STS certificate has been exported to the local folder C:\cert with the names stscert.pfx and stscert.cer. The variables to be updated before using the script are:

      · $stscertpfx: The certificate as PFX file (including Private Key)

      · $stscertcer: The certificate as CER file

      · $stscertpassword: The password of the PFX certificate

      · $spcn: The CN the certificate was issued to

      · $spsite: SharePoint On-Premise Site Collection

      · $spoappid: This is always "00000003-0000-0ff1-ce00-000000000000"

      Launch SharePoint 2013 Management Shell. The first step would be to define the variables as shown below.

      #Variables

      $stscertpfx="c:\cert\stscert.pfx"

      $stscertcer="c:\cert\stscert.cer"

      $stscertpassword="LS1setup!"

      $spcn="*.yourdomainname.com" # replace yourdomainname with your onpremise domain that you added to Office 365

      $spsite="http://spweb"

      $spoappid="00000003-0000-0ff1-ce00-000000000000"

      #Update the Certificate on the STS

      $pfxCertificate=New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 $stscertpfx, $stscertpassword, 20

      Set-SPSecurityTokenServiceConfig -ImportSigningCertificate $pfxCertificate

      # Type Yes when prompted with the following message.

      You are about to change the signing certificate for the Security Token Service. Changing the certificate to an invalid, inaccessible or non-existent certificate will cause your SharePoint installation to stop functioning. Refer to the following article for instructions on how to change this certificate: http://go.microsoft.com/fwlink/?LinkID=178475. Are you sure, you want to continue?

      #Restart IIS so STS Picks up the New Certificate

      & iisreset

      & net stop SPTimerV4

      & net start SPTimerV4

      #To Validate Certificate Replacement

To validate that the above commands have run successfully, you can run the following cmdlets.

      $pfxCertificate

      (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate

Compare the outputs of the above commands; matching thumbprints confirm that the STS certificate has been successfully replaced. Alternatively, you can follow the steps below.

      Get-SPServiceApplication

$stsSA=Get-SPServiceApplication | ? {$_.Id -eq "5837da73-b393-444f-ae2c-ac057877df08"}   # replace the GUID with the Id of the SecurityTokenServiceApplication returned by Get-SPServiceApplication above

      $stsSa.SigningCertificate

      Now, you should be able to match the value with the thumbprint of the self-signed certificate you replaced with.
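A one-line sketch that compares the two thumbprints directly:

# Should return True once the STS has picked up the new certificate
$pfxCertificate.Thumbprint -eq (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate.Thumbprint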

       

      #Do Some Conversions With the Certificates to Base64

      $pfxCertificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 -ArgumentList $stscertpfx,$stscertpassword

      $pfxCertificateBin = $pfxCertificate.GetRawCertData()

      $cerCertificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2

      $cerCertificate.Import($stscertcer)

      $cerCertificateBin = $cerCertificate.GetRawCertData()

      $credValue = [System.Convert]::ToBase64String($cerCertificateBin)

      #Establish Remote Windows PowerShell Connection with Office 365

      enable-psremoting

      #When prompted with Are you sure you want to perform this action? type Yes for all of the actions.

      new-pssession

Import-Module MSOnline -Force -Verbose
Import-Module MSOnlineExtended -Force -Verbose

      #Log on as a Global Administrator for Office 365

      Connect-MsolService

      When prompted, provide the Global Admin account for your Office 365 tenant. This would have been sent to your corporate e-mail address when you signed up for the tenant.
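If you prefer not to be prompted interactively, Connect-MsolService also accepts a stored credential; a small sketch (avoid hard-coding passwords in scripts):

# Prompt once and reuse the credential object
$o365Cred = Get-Credential   # enter the Office 365 Global Admin UPN and password
Connect-MsolService -Credential $o365Cred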

      #Register the On-Premise STS as Service Principal in Office 365

      New-MsolServicePrincipalCredential -AppPrincipalId $spoappid -Type asymmetric -Usage Verify -Value $credValue

      $SharePoint = Get-MsolServicePrincipal -AppPrincipalId $spoappid

      $spns = $SharePoint.ServicePrincipalNames

      $spns.Add("$spoappid/$spcn")

      Set-MsolServicePrincipal -AppPrincipalId $spoappid -ServicePrincipalNames $spns

      $spocontextID = (Get-MsolCompanyInformation).ObjectID

      $spoappprincipalID = (Get-MsolServicePrincipal -ServicePrincipalName $spoappid).ObjectID

      $sponameidentifier = "$spoappprincipalID@$spocontextID"

      #Finally Establish in the On-Premise Farm a Trust with the ACS

      $site=Get-Spsite "$spsite"

      $appPrincipal = Register-SPAppPrincipal -site $site.rootweb -nameIdentifier $sponameidentifier -displayName "SharePoint Online"

      Set-SPAuthenticationRealm -realm $spocontextID

      New-SPAzureAccessControlServiceApplicationProxy -Name "ACS" -MetadataServiceEndpointUri "https://accounts.accesscontrol.windows.net/metadata/json/1/" -DefaultProxyGroup

      New-SPTrustedSecurityTokenIssuer -MetadataEndpoint "https://accounts.accesscontrol.windows.net/metadata/json/1/" -IsTrustBroker -Name "ACS"
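Once these commands complete, a quick verification sketch to confirm the trust and realm are in place:

# The token issuer created above should be listed, and the realm should now equal $spocontextID
Get-SPTrustedSecurityTokenIssuer | Select-Object Name, RegisteredIssuerName
Get-SPAuthenticationRealm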

      Displaying Results from SharePoint Online

After the trust is established, additional configuration is needed in the SharePoint Server 2013 on-premise environment in order for SharePoint Online results to show up on SharePoint Server 2013. As a rule of thumb, the configuration always happens on the tier that should display the results from the other tier.

      Two steps are needed to configure Hybrid Search:

      1. Create a result source.

      2. Create a query rule.

Our focus in this post is the configuration steps; you can read more about result sources and query rules here.

      New Result Source

On the SharePoint box (SPWEB), browse to the on-premise SharePoint site collection http://spweb. Result source configuration can happen at different levels: globally in the Search Service Application, locally per site collection, or per site.

      For this post, we will go with result source at site collection level.

      1. Browse to On-premise site collection http://spweb.

      2. In Site Settings, under Site Collection Administration, click Search Result Sources.

      3. On the Manage Result Sources page, click New Result Source.

      4. On the Search Result Sources page, do the following:

a. In the Name text box, type a name for the new result source (for example, SharePoint Online RS).

      b. For the Protocol, select Remote SharePoint.

      c. For the Remote Service URL, type the address of the root site collection of the Office 365 SharePoint Online environment whose results should be included (for example, https://mytenant.sharepoint.com).

      d. For the Type, select SharePoint Search Results.

e. Leave Query Transform at its default, which is {searchTerms}.

f. Leave Credentials Information at its default, which is Default Authentication.

      g. Click OK to save the new result source.

      Note:

      If you edit the result source, you should see the settings identical to the ones shown below.

      image
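For reference, a result source can also be created through the search object model from PowerShell. The rough sketch below creates an equivalent result source at the Search Service Application (global) level rather than the site collection level; treat the provider name and property values as assumptions to verify against your farm before use:

# Sketch: create a Remote SharePoint result source at the SSA level
$ssa = Get-SPEnterpriseSearchServiceApplication
$fedManager = New-Object Microsoft.Office.Server.Search.Administration.Query.FederationManager($ssa)
$owner = New-Object Microsoft.Office.Server.Search.Administration.SearchObjectOwner([Microsoft.Office.Server.Search.Administration.SearchObjectLevel]::Ssa)
$rs = $fedManager.CreateSource($owner)
$rs.Name = "SharePoint Online RS"
$rs.ProviderId = $fedManager.ListProviders()["Remote SharePoint Provider"].Id   # provider name (assumption)
$rs.ConnectionUrlTemplate = "https://mytenant.sharepoint.com"
$rs.CreateQueryTransform("{searchTerms}")
$rs.Commit()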

       

      The next step would be to create a new query rule.

      New Query Rule

      1. Open Internet Explorer and browse to http://spweb.

      2. In Site Settings, under Site Collection Administration, click Search Query Rules.

      3. In the Select a Result Source drop-down list, select the result source you created before (for example, SharePoint Online RS).

      image

       

      4. Click New Query Rule.

      5. In the General Information section, in the Rule Name box, type a name for the new query rule (for example, SharePoint Online QR).

      6. Click the Context link to expand the options.

      7. In the Context section, do the following:

a. Under the Query is performed on these sources option, select either All Sources or One of these sources. If you select One of these sources, make sure to select the result source created earlier (here, SharePoint Online RS).

      image

       

      b. Leave the default selection for the Query is performed from these categories and Query is performed by these user segments options.

       

       

      8. In the Query Conditions section, click Remove Condition so the rule will fire for every query.

      9. In the Actions section, under Result Blocks, click Add Result Block.

      10. In the Edit Result Block dialog box, do the following:

      a. Leave the default for the Query Variables and Block Title sections.

b. In the Query section, in the Search this Source drop-down list box, select the name of the result source that you created before (here, SharePoint Online RS). In the Items drop-down list, specify the maximum number of items to show (the default is 2).

      c. Click the Settings hyperlink.

      d. In the Settings section, make sure the This block is always shown above core results option is selected.

      image

      e. Skip the Routing section and click OK to add the result block.

      11. Back at the Add Query Rule page, click the Publishing hyperlink.

      12. In the Publishing section, make sure the Is Active check box is selected.

      image

      13. Click Save.

      Note:

      Once you view the Query rule, it should look identical to the following screenshot.

       

      image

       

      Validating Your Search Configuration

      You can validate your search configuration and see the troubleshooting information with the following procedure:

      1. Open Internet Explorer and browse to http://spweb.

      2. In the Site Settings section, under Site Collection Administration, click Search Result Sources.

      3. In the Manage Result Sources page, click the result source you created in the previous procedure (for example, SharePoint Online RS).

      4. In the Edit Result Source page, click Launch Query Builder.

      5. In the Build Your Query page, select the Test tab.

      6. Click the Show more hyperlink.

      7. Type a search term of your choice in the textbox next to {subject terms} and click Test Query (Hint: “*” is also a valid search term).

If your configuration is valid, relevant search results will be displayed in the Search Result Preview window. If there are problems with your configuration, troubleshooting information will be displayed instead. We will talk about some common error messages in the troubleshooting section of this series.
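Before testing the remote source, you can also run a quick scripted sanity check that the on-premise Search REST endpoint is responding; a sketch, assuming http://spweb as in this lab:

# HTTP 200 means the on-premise Search REST endpoint is up
$resp = Invoke-WebRequest -Uri "http://spweb/_api/search/query?querytext='*'" -UseDefaultCredentials
$resp.StatusCode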

      Validate Search Results

The next step is to validate that the user is able to retrieve search results from SharePoint Online.

Browse to http://spweb and, in the search box scope drop-down, select Everything.

      image

       

Within the search text box, type *. Search results should be displayed from both verticals (online and on-premise), identical to the figure below. Click one of the documents from SharePoint Online and you should be redirected to the sign-in page for SharePoint Online. Enter the username and password for the user that you synced earlier using DirSync (for example, manas@mbspoincloud.com). The user should be signed in and able to access the document.

      image

       

Note that this experience will change when we deploy an ADFS server on-premise and establish a trust, giving us an IdP (Identity Provider) and an RP (Relying Party) to provide the end user with a single sign-on experience. We will install ADFS and take a look at the SSO experience in part 3 of this series.

Please watch this space for Part 2 of this series, which will be coming soon!

       

      POST BY :  MANAS BISWAS [MSFT]

SharePoint 2010: Using blob caching throws many errors in the ULS and Event Logs - The system cannot find the file specified


       

      Are you seeing several errors related to Blob Caching in the ULS Logs / Event Viewer?

Various ULS log errors:

      · "An error occurred in the blob cache. The exception message was: 'The system cannot find the file specified. (Exception from HRESULT: 0x80070002)'."

      · GetFileFromUrl: FileNotFoundException when attempting get file Url /favicon.ico The system cannot find the file specified. (Exception from HRESULT: 0x80070002)    at Microsoft.SharePoint.Library.SPRequestInternalClass.GetMetadataForUrl(String bstrUrl, Int32 METADATAFLAGS, Guid& pgListId, Int32& plItemId, Int32& plType, Object& pvarFileOrFolder)     at Microsoft.SharePoint.Library.SPRequest.GetMetadataForUrl(String bstrUrl, Int32 METADATAFLAGS, Guid& pgListId, Int32& plItemId, Int32& plType, Object& pvarFileOrFolder)     at Microsoft.SharePoint.SPWeb.GetListItem(String strUrl, Boolean bFields, String[] fields)     at Microsoft.SharePoint.Publishing.CommonUtilities.GetCurrentFileVersionFromUrl(String url, SPWeb web)

      · Error in blob cache. System.IO.FileNotFoundException: The system cannot find the file specified. (Exception from HRESULT: 0x80070002)     at Microsoft.SharePoint.Library.SPRequestInternalClass.GetMetadataForUrl(String bstrUrl, Int32 METADATAFLAGS, Guid& pgListId, Int32& plItemId, Int32& plType, Object& pvarFileOrFolder)     at Microsoft.SharePoint.Library.SPRequest.GetMetadataForUrl(String bstrUrl, Int32 METADATAFLAGS, Guid& pgListId, Int32& plItemId, Int32& plType, Object& pvarFileOrFolder)     at Microsoft.SharePoint.SPWeb.GetListItem(String strUrl, Boolean bFields, String[] fields)     at Microsoft.SharePoint.Publishing.CommonUtilities.GetCurrentFileVersionFromUrl(String url, SPWeb web)     at Microsoft.SharePoint.Publishing.CommonUtilities.GetPublishedFile(SPWeb web, ULSCat ulsCategory, String fileUrl, SPFile& file, SPListItem& item)     at Microsoft.SharePoint.Publishing.BlobCache.<>c__DisplayClass34.<>c__DisplayClass37.<FetchItemFromWss>b__31()     at Microsoft.Office.Server.Diagnostics.FirstChanceHandler.ExceptionFilter(Boolean fRethrowException, TryBlock tryBlock, FilterBlock filter, CatchBlock catchBlock, FinallyBlock finallyBlock)

      · Unable to cache URL /FAVICON.ICO.  File was not found

      Event Log error:

      Log Name:      Application

      Source:        Microsoft-SharePoint Products-Web Content Management

      Date:          7/30/2013 1:57:45 PM

      Event ID:      5538

      Task Category: Publishing Cache

      Level:         Error

      Keywords:     

      User:          CONTOSO\SPSvc

      Computer:      SP.contoso.com

      Description:

      An error occured in the blob cache.  The exception message was 'The system cannot find the file specified. (Exception from HRESULT: 0x80070002)'.

      Event Xml:

      <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">

        <System>

          <Provider Name="Microsoft-SharePoint Products-Web Content Management" Guid="{0219F589-72D7-4EC3-ADF5-1F082061E832}" />

          <EventID>5538</EventID>

          <Version>14</Version>

          <Level>2</Level>

          <Task>1</Task>

          <Opcode>0</Opcode>

          <Keywords>0x4000000000000000</Keywords>

          <TimeCreated SystemTime="2013-07-20T18:57:45.255293200Z" />

          <EventRecordID>308352</EventRecordID>

          <Correlation />

          <Execution ProcessID="3052" ThreadID="2236" />

          <Channel>Application</Channel>

          <Computer>www.contoso.com</Computer>

          <Security UserID="S-1-5-21-1385174992-979951090-295046656-1108" />

        </System>

        <EventData>

          <Data Name="string0">The system cannot find the file specified. (Exception from HRESULT: 0x80070002)</Data>

        </EventData>

      </Event>

      Status:

      We are aware of this issue and are currently investigating.
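For reference while you investigate on your side, blob caching is controlled per web application by the BlobCache element in web.config. The following sketch inspects the current settings; the virtual directory path below is an assumption for a default port 80 web application:

# Sketch: read the BlobCache settings from a web application's web.config
[xml]$config = Get-Content "C:\inetpub\wwwroot\wss\VirtualDirectories\80\web.config"
$config.configuration.SharePoint.BlobCache | Format-List location, path, maxSize, enabled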

       

       

      Blog By : Vijay Gangolli [MSFT]
