Please Stay Out Of The Database!!!
We had a fun internal debate this week on a new tool developed in France. If you can read French, it’s documented here; the download link for the executable (whose UI is in English) is here. The tool grabs deleted documents from the database and saves them to your local hard drive.
I’m far more in favor of undelete techniques that are user-accessible and use our object model to do what they do. Commercial products like SmartLibrary (to name one of several) or roll-your-own code examples are available. But this tool is (a) free, (b) read-only with respect to the database - the only writing it does is to your hard drive - and (c) only really usable by an administrator. In an emergency, it may be a good last-resort tool. I haven’t used it myself, so I can’t really comment on whether it works.
I’d be tempted to say it at least can’t hurt, but one of our developers reminded me that even reading from the database can indeed have negative effects on performance and stability. Just to name two such effects, I’ll paraphrase his comments:
- Deadlocks: direct SELECT statements against the database take shared read locks at the default transaction isolation level, and can violate our lock hierarchy, leading to seemingly random failures.
- Incorrect data: if you avoid taking locks to avoid a deadlock, you’ll very likely observe the database in an intermediate state and retrieve incorrect data.
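To make both bullets concrete, here is an illustrative T-SQL sketch. The table and column names below are hypothetical stand-ins, not the actual SharePoint schema, and the two "sessions" are interleaved into one listing for readability:

```sql
-- Illustration only: hypothetical tables, two sessions interleaved.

-- Session A (the product), mid-way through a multi-table transaction:
BEGIN TRANSACTION;
UPDATE Docs SET DirName = 'sites/hr/archive' WHERE DocId = 42;
-- ...the matching rows in Webs have not been updated yet...

-- Session B (a "harmless" read-only tool) at default READ COMMITTED:
-- it takes shared locks, and if it touches tables in a different
-- order than the product's transactions do, the sessions can deadlock.
SELECT d.DirName, w.FullUrl
FROM Docs AS d
JOIN Webs AS w ON w.WebId = d.WebId;

-- Session B with NOLOCK: no blocking and no deadlock, but it reads
-- Session A's half-finished work - the Docs row no longer agrees with
-- its Webs row, so the result is simply wrong.
SELECT d.DirName, w.FullUrl
FROM Docs AS d WITH (NOLOCK)
JOIN Webs AS w WITH (NOLOCK) ON w.WebId = d.WebId;

-- Session A:
COMMIT TRANSACTION;  -- only now is the database consistent again.
```

The uncomfortable trade-off is visible here: taking locks risks deadlocks against the product's own transactions, and skipping locks risks reading mid-transaction garbage. There is no third option at the SQL level; the object model is the third option.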
Hopefully I don’t need to remind anyone of the dangers of writing to our database, but those risks should be far more obvious.
And it’s not that, if you wind up writing code that communicates with our database, I’m going to hate you. It’s that we (and Microsoft) won’t be able to help you. It’s that you will run the risk of breaking or degrading the performance and stability of technology we built to be fast and robust. The odds of a database change in our next Service Pack aren’t small. It’s that you’ll be on your own if anything goes wrong. Consider this carefully.
Our object model is a contract with you. We’ve gone to a lot of trouble to ensure that using the OM results in stable and performant database interactions. If anything’s amiss, we’re honor-bound to fix it. Please, please use it.
Comments
Anonymous
April 01, 2005
I'm a friend of Stephane, the author of this tool.
He's a real technical guy, a huge fan of everything SharePoint (WSS/SPS), and he really knows how to deal with it.
Sorry that not everyone reads French, but the real purpose of this tool is not to connect to the SharePoint DB to extract documents as a matter of routine:
>>> this is a tool for emergencies only.
Let me explain: in certain cases - a FrontPage manipulation, an IIS failure, or some other critical problem - the SharePoint interface can be broken. It is also possible that you don't have time to restore a full WSS installation (or fear losing an upgrade).
This tool offers ADMIN people the possibility to connect to the DB, or to a copy/backup of the DB, to get documents back.
Imagine this situation: a user has just deleted his document on a huge SPS site. You can restore a backup of the SPS DB to a separate SQL Server and, with the tool, run a recovery and send the document back to its author without a FULL SPS restore.
Stephane created this tool to help the SharePoint community have fewer IT/restore problems, as I try to do with the Granular/GuiSmigrate tools.
Otherwise, you are totally right: a lot of people prefer to operate directly on the DB rather than learn to use the OM and/or CAML (hmmm, CAML :) ).
So many times in the newsgroups I have had to tell people not to run SQL directly against the SPS/WSS DB...
Now there is a post which explains CLEARLY that modifying DB procedures, or acting directly on the DB, is definitely NOT a good solution. Thanks!Anonymous
April 03, 2005
Renaud, this particular tool is (a) probably a fine addition to any admin's toolbox, (b) something I, given my job, can't endorse no matter how good it might be because it uses an unsupported technique, (c) something that spurred a few more general comments about database access that bore repeating, (d) not something admins should think absolves them of the need to invest in a recycle bin solution.Anonymous
April 04, 2005
Stephane Cordonnier has released a tool, SharePoint Document Recovery…Anonymous
April 04, 2005
I understand.
Your point of view is clear now.
Thanks for everything.Anonymous
April 05, 2005
Not sure why MS is so big on storing the .aspx pages as BLOBs in the database. Wouldn't performance improve if they were stored as flat files? BLOBs seem OK with 100-200 pages, but if your site has 50,000 .aspx pages as BLOBs, it really hurts performance. I understand the security issues with storing them as flat files, but I am not convinced by the idea of BLOBs.Anonymous
April 05, 2005
Do you know the "ghosting" concept ? :-)Anonymous
April 05, 2005
I've had this listed in the WSS FAQ for almost half a year.
The item at
http://www.asaris-matrix.com/mvp-walsh/Lists/WSS%20FAQ/DispForm.aspx?ID=316
includes a brief text (in English) given to me at the time by the author.Anonymous
April 11, 2005
I understand your point of view.
But I have a particular problem and I don't know how to solve it without writing to the database:
I created a workflow engine (using the object model). Nearly everything is fine, but when the workflow engine changes something (e.g. copies a document to another document library), the Modified and Modified By fields are changed to the current time and to the user account the workflow engine runs under.
And everything (politically correct ;-)) I tried didn't work. My solution: a little stored procedure changes the values of those fields back to what I want.
If you have other suggestions, I would be glad to change it back to the standard approach...Anonymous
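[For readers hitting this today: a hedged object-model sketch of the problem above. It assumes the WSS 3.0+ object model - SystemUpdate() may not exist in the version the commenter was using - and the list name, URL, and "Status" field are placeholders, not anything from the original post. SystemUpdate() persists field changes without touching Modified / Modified By and without creating a new version:]

```csharp
// Sketch only - assumes WSS 3.0+; URL, list name, and field are placeholders.
using (SPSite site = new SPSite("http://server/sites/demo"))
using (SPWeb web = site.OpenWeb())
{
    SPListItem item = web.Lists["Shared Documents"].GetItemById(1);

    // Write a change without stamping the workflow account or the
    // current time into Modified / Modified By.
    item["Status"] = "Routed by workflow";
    item.SystemUpdate();
}
```

Where SystemUpdate() is unavailable, the other OM-level avenue is to capture the original Modified/Editor values before the copy, set them back on the new item, and call Update() - still staying on the supported side of the line rather than patching the fields with a stored procedure.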
April 12, 2005
Hi Mike,
I strongly recommend that MS not publish detailed information on the database tables in the next version of the SDK.
"If you document it, they will come"Anonymous
April 18, 2005
Hi Mike!
Thank you for clarifying this. It is important for me as a consultant to get this kind of input from you. You often meet customer demands and wishes in this area, and you are often confronted with articles like this:
http://msd2d.com/newsletter_tip.aspx?section=sharepoint&id=6fbc9145-cc08-4a5b-a7d7-14b37e014672Anonymous
April 19, 2005
I guess this is as good a blog as any to make a plea for a new site management model that works with SharePoint deployments involving large numbers of sites.
We have a deployment with currently 30,000+ sites, growing at about 20,000 per year. Most of these sites are based on the same template.
We hope and expect this application to have a long lifetime, during which we will extend the UI, add additional document libraries, etc., and we want to use the powerful design tools offered by FrontPage to design our sites.
Our problem is: how do we update our existing (30,000) sites with an updated template?
We have grappled with ghosting issues, read all the material on the ghosting debate, and in summary the only way we have found to do this is to re-create sites whenever they get revisited: we test the template version they were built on and, if it is out of date, we copy all the content to a temp site, delete the source site, re-create it, and then copy all the content back...
Truly horrible, and (to get to the point of this blog) once a site contains a lot of content and versioning is on, we get into all kinds of problems: versions not being copied over, metadata being overwritten, and copying all these files takes a long time...
So to do the right thing and use the API, we need an SPFile.MoveTo that works across sites in different site collections (and copies all metadata and versions, and is fast - so presumably doesn't actually move any data, just updates attributes...).
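[For context, the OM technique the commenter is forced into looks roughly like this - a sketch with placeholder URLs and file names, not code from the original post. It carries the bytes across site collections but, as complained about above, silently drops the version history and most metadata:]

```csharp
// Sketch: copying a file across site collections via the object model.
// URLs and paths are placeholders. Versions and most metadata are lost.
using (SPSite srcSite = new SPSite("http://server/sites/old"))
using (SPWeb srcWeb = srcSite.OpenWeb())
using (SPSite dstSite = new SPSite("http://server/sites/new"))
using (SPWeb dstWeb = dstSite.OpenWeb())
{
    SPFile srcFile = srcWeb.GetFile("Shared Documents/spec.doc");
    byte[] bytes = srcFile.OpenBinary();  // pulls the full content over the wire

    // Add() creates a brand-new file: fresh Created/Modified stamps,
    // and the version history starts over.
    dstWeb.Files.Add("Shared Documents/spec.doc", bytes, true);
}
```

This is exactly why the commenter wants a first-class cross-site-collection MoveTo: the supported path today is a byte-level copy that re-creates the file rather than moving it.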
And the much bigger issue: we hope we are not the only ones trying to roll out and administer large SharePoint deployments where there is a need to centrally manage site updates, etc., because it is proving a real nightmare for us!Anonymous
June 01, 2005
I am dealing with the same problem as Patrick Fischer on a smaller scale: our Operations department needs to update several site templates often, and there is no easy way to do it.
And I have found it difficult to get good documentation.
So we are very tempted to write directly to the database :-)Anonymous
June 26, 2005
If you want to modify a WSS or Portal area template which will affect all existing sites then you have to modify the file system template files. The other way is to use FrontPage to manually modify all the sites, but this is very time consuming and unghosts the pages. If the files are unghosted then you will need to use a tool like "Ghost Hunter" to re-ghost them.
An article that discusses the ghosting/unghosting issue:
http://www.codeguru.com/Cpp/misc/misc/tools/article.php/c9581/
The "Ghost Hunter" web part:
http://www.bluedoglimited.com/Downloads/pages/Web%20Part%20Toolkit.aspx
Other useful tools:
http://www.google.com.au/search?q=SharePoint+Ghost+hunter+webpart&hl=en&biw=Anonymous
August 31, 2005
If it is available, USE IT.
That's the reason why we are developers and consultants, and not users.
Please don't tell me I can't mod my XBOX or
put non-recommended brake pads in my car.Anonymous
September 12, 2005
Mike Fitzmaurice has provided a very nice explanation about the boundaries for Microsoft Support. A very...Anonymous
December 01, 2005
Excuse me, but I don't see how the shared locks of SELECTs can provoke deadlocks, since they are released as each SELECT completes at the (default) READ COMMITTED isolation level.
I grant that this can cause timeouts if a SELECT runs too long (which is not a good thing either), but I can't figure out how a deadlock would arise this way. Is it related to the sequence of shared locks acquired on multiple tables by a single SELECT with joins? I thought the locks higher in the DB hierarchy (table, database) were there to protect against this problem.
Can you explain your statement about the deadlocks?Anonymous
February 15, 2006
Hey Mike,
Is there a KB article somewhere that states that direct database updates are not supported?
Thanks
CAnonymous
February 18, 2006
I don't know what the heck is up with the blog service here, but I can't for the life of me, get...Anonymous
May 16, 2006
This blog entry is old, and the comments are far too many to expect an individual reply at this point. But since comments are still open, I'll take my chances and ask my questions about support, as they are important.
1) If we use a third-party web part, what is the best way to tell whether it is using a "direct database" approach or the OM - SQL Profiler?
2) What are the consequences of using third-party SharePoint web parts as far as "breaking Microsoft support" goes? Any reference material?
3) What would be the criteria for "Microsoft Support" to declare a case "not supported due to direct database modifications"? Are there any such cases?
Would highly appreciate answers; these will affect a lot of buying decisions, as well as third-party tool development.
Thank youAnonymous
January 05, 2008
PingBack from http://birthdays.247blogging.info/?p=3502Anonymous
January 05, 2008
PingBack from http://birthdays.247blogging.info/?p=3564Anonymous
May 30, 2008
PingBack from http://mikefitzmaurice.wordpress.com/2008/05/30/holy-spit-they-actually-built-it/Anonymous
November 28, 2008
As you certainly know, all of the data behind the very operation of SharePoint 2007 residesAnonymous
January 22, 2009
PingBack from http://www.hilpers.fr/942406-comment-extraire-linformation-depuis-sql