Export log for a crawl on a shared drive

Sep 18, 2009 at 2:39 PM

I was able to run the command successfully to export the log for a crawl performed on our website.  However, I have also configured a shared drive as a content source that is crawled periodically.  When I try to export the log for this content source, I get an error message.

Here is the command I used:

C:\Program Files\Common Files\microsoft shared\Web Server Extensions\12\BIN>stsadm -o ExportCrawlLog -t d -site "//co-shr-w/SHRVOL/SHARED/" -history -outfile c:\downloads\crawl.log -s "2009-09-18 07:30:00" -e "2009-09-18 08:00:00" -mt w

and the error message I got is:

The Web application at //co-shr-w/SHRVOL/SHARED/ could not be found. Verify that you have typed the URL correctly. If the URL should be serving existing content, the system administrator may need to add a new request URL mapping to the intended application.

I also tried:

C:\Program Files\Common Files\microsoft shared\Web Server Extensions\12\BIN>stsadm -o ExportCrawlLog -t d -site "file://co-shr-w/SHRVOL/SHARED/" -history -outfile c:\downloads\crawl.log -s "2009-09-18 07:30:00" -e "2009-09-18 08:00:00" -mt w

but got the same error message.

Can you please suggest the right way to do this?


Thanks in advance.


Coordinator
Sep 18, 2009 at 3:16 PM

Unfortunately, I think you have uncovered a defect in the tool. I must admit that exporting log messages from a file share content source is not accommodated in the current implementation.  You will notice in the source code, at line 42 of ExportCrawlLog.vb, that the code assumes the -site parameter refers to a SharePoint site in the farm.  This is a design problem with the tool, so unfortunately I can't offer a workaround with the current version.
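Roughly speaking, the failure is of this shape (a simplified sketch, not the literal code at line 42; the module and variable names here are illustrative):

Imports Microsoft.SharePoint

Module Sketch
    Sub Main()
        ' The exporter hands the -site value straight to SPSite, and
        ' SPSite only accepts URLs that map to a web application in
        ' the farm, so a file-share URL fails up front with
        ' "The Web application at ... could not be found."
        Dim siteUrl As String = "file://co-shr-w/SHRVOL/SHARED/"
        Using site As New SPSite(siteUrl)
            ' Never reached for a file-share URL.
        End Using
    End Sub
End Module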

Sep 18, 2009 at 4:00 PM

I am not a developer, so I can't comment on the code, but with some exploration of the options I found a way to get it done.  I used the -hostname option to achieve my results:

C:\Program Files\Common Files\microsoft shared\Web Server Extensions\12\BIN>stsadm -o ExportCrawlLog -t d -site "http://XXYXX/" -history -outfile c:\downloads\crawl.csv -s "2009-09-12 07:20:00" -e "2009-09-18 08:00:00" -hostname co-shr-w -mt e

To get the errors on the actual SharePoint site I had to use:

C:\Program Files\Common Files\microsoft shared\Web Server Extensions\12\BIN>stsadm -o ExportCrawlLog -t d -site "http://XXYXX/" -history -outfile c:\downloads\crawl.csv -s "2009-09-12 07:20:00" -e "2009-09-18 08:00:00" -hostname XXYXX -mt e  

To get a consolidated error log, just exclude the -hostname option.  That does the trick.
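For example, the consolidated export should just be the same command without -hostname (parameters as above):

C:\Program Files\Common Files\microsoft shared\Web Server Extensions\12\BIN>stsadm -o ExportCrawlLog -t d -site "http://XXYXX/" -history -outfile c:\downloads\crawl.csv -s "2009-09-12 07:20:00" -e "2009-09-18 08:00:00" -mt e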

Double thumbs up to lkuhn.  Not a defect.  It was an operator error.

Thanks.


Coordinator
Sep 18, 2009 at 11:08 PM

Great news - thanks for following up with your findings.