SharePoint 2013 - Export Index a la Crawl Log

I ran into this while replacing a competitor's search service: I needed to validate that SharePoint had indexed the same data, so I used the crawl log to pull a list of everything SharePoint had crawled. Here's the code:

# Run this from the SharePoint 2013 Management Shell, or load the snap-in first:
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Get the Search Service Application and wrap it in a CrawlLog object
$ssa = Get-SPEnterpriseSearchServiceApplication
$cl = New-Object Microsoft.Office.Server.Search.Administration.CrawlLog $ssa

# GetCrawledUrls(getCountOnly, maxRows, urlQueryString, isLike, contentSourceID, errorLevel, errorID, startDateTime, endDateTime)
# errorLevel: 0 = successes, 1 = warnings, 2 = errors; -1 for contentSourceID/errorID means no filter
$cl.GetCrawledUrls($false,1000000,"",$false,-1,0,-1,[datetime]::minvalue,[datetime]::maxvalue) | Export-Csv -NoTypeInformation successes.csv # This will likely be huge
$cl.GetCrawledUrls($false,1000000,"",$false,-1,1,-1,[datetime]::minvalue,[datetime]::maxvalue) | Export-Csv -NoTypeInformation warnings.csv
$cl.GetCrawledUrls($false,1000000,"",$false,-1,2,-1,[datetime]::minvalue,[datetime]::maxvalue) | Export-Csv -NoTypeInformation errors.csv
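Since the success export in particular can be huge, it may be worth asking the crawl log for counts before committing to the full exports above. The first parameter to GetCrawledUrls is a count-only flag; the loop below is my own sketch and assumes the method returns just the count when that flag is set:

# Count-only pre-check: same arguments as above, but with $true as the first parameter.
# Assumption: with the flag set, the method returns a row count rather than the full row set.
0..2 | ForEach-Object {
    "Error level $_ :"
    $cl.GetCrawledUrls($true,1000000,"",$false,-1,$_,-1,[datetime]::minvalue,[datetime]::maxvalue)
}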

From there it was just a quick ETL of both item lists into SQL, a full outer join to surface anything present in one index but missing from the other, and done.
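If standing up a SQL database feels like overkill, a rough PowerShell stand-in for the full outer join is to diff the two URL lists directly. The competitor_index.csv file name and the Url column below are assumptions about the shape of the two exports, so treat this as a sketch:

# Load the item URLs from both exports (adjust the column name to whatever the CSVs actually contain).
$spUrls    = Import-Csv successes.csv        | Select-Object -ExpandProperty Url
$otherUrls = Import-Csv competitor_index.csv | Select-Object -ExpandProperty Url

# Compare-Object lists items present on only one side, much like the NULL-padded rows of a
# full outer join: '<=' means SharePoint-only, '=>' means competitor-only.
Compare-Object -ReferenceObject $spUrls -DifferenceObject $otherUrls |
    Export-Csv -NoTypeInformation index_differences.csv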

Comments

  • Anonymous
    January 01, 2003
    Great post. Thank you.
  • Anonymous
    January 01, 2003
Thanks for this command. Unfortunately, it appears this method will be removed in upcoming versions: https://msdn.microsoft.com/fr-fr/library/jj264492.aspx
  • Anonymous
    December 30, 2014
Thanks a lot for this article. It worked...