schemazen
Option to Export Table Data as Insert Statements [Feature Request]
This would allow someone to be able to completely restore the database manually without the use of SchemaZen.
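For illustration, the kind of script such an option might produce could look something like this (the [dbo].[Widgets] table and its data are made up here, just to show the idea):

-- identity_insert only needed if [Id] is an identity column
set identity_insert [dbo].[Widgets] on;
insert into [dbo].[Widgets] ([Id], [Name], [Price]) values (1, N'Hammer', 9.99);
insert into [dbo].[Widgets] ([Id], [Name], [Price]) values (2, N'Wrench', 14.50);
set identity_insert [dbo].[Widgets] off;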
Hi, thanks for the suggestion. I agree that there are circumstances in which it could be useful to be able to manually restore a database without SchemaZen. However, in my experience, if a table has many thousands of rows and "insert" statements are used to re-create its data, the SQL script becomes absolutely massive, and SQL Server Management Studio chokes trying to parse, syntax-highlight, and execute it...
In my opinion, it would be better to use the "import data" feature of SSMS, or indeed, the bcp utility (with an appropriate "format file") to get the data into the table, rather than using insert statements. I realize that this might not be particularly easy with the current format used by SchemaZen, but I think it would still be doable.
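For example, something along these lines might work (every name and path below is a placeholder, and the generated format file would almost certainly need hand-editing to match SchemaZen's tab-separated layout):

# generate a character-mode format file from the destination table, then tweak it to match the data file
bcp [DESTDB].[dbo].[MyTable] format nul -c -f MyTable.fmt -S DESTSERVER -T
# load the data file using the edited format file; -E keeps identity values, -T uses Windows auth
bcp [DESTDB].[dbo].[MyTable] in data\dbo.MyTable.tsv -f MyTable.fmt -S DESTSERVER -T -E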
But it's not my decision to make, it's @sethreno's project :)
I think @keith-hall has a good point about bcp. I originally chose the tab separated format because there's less noise in the files, but I'm not opposed to the idea of using insert statements if other people find it useful.
I know what you mean about Management Studio. I usually use SQLCMD to execute large scripts. Not as fast as BCP, but not as fiddly to set up, and it doesn't require manually stepping through the import wizard. This isn't something I immediately need; it was just something I thought might be useful in general.
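In case it's useful, the invocation is roughly this (server, database, and script names are placeholders; -E uses Windows authentication and -i points at the script file):

sqlcmd -S SERVERNAME -d DATABASENAME -E -i HugeInsertScript.sql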
If you are interested, I actually used schemazen and bcp (strung together with Posh) to do this same thing. I have used bcp many times in the past for bulk loading, and I had a database that was corrupt only in its metadata; it could not be repaired in any fashion and had to be recreated and scripted out. Unfortunately, SSMS and SMO in general crashed whenever attempting to script out the objects, so this project was a lifesaver!
Basically I ran the existing options in schemazen to output the objects, created a new db from them, and ran a query on sys.tables on the old database to fill in the placeholders in these commands (I can provide the query if it is at all useful):
(export) "select * from [SOURCEDBNAME].[SCHEMANAME].[TABLENAME]" queryout "FOLDERNAME-TableName.bcpfile" -N -S SOURCESERVERNAME -T -E'
(import 10k rows at a time, easily adjustable) "[DESTDBNAME].[SCHEMANAME].[TABLENAME]" in "FOLDERNAME.bcpfile" -N -S DESTSERVERNAME -T -E -b 10000'
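For what it's worth, here's a rough PowerShell sketch of how the pieces could be strung together; it is not the exact script described above, and every server, database, and folder name in it is a placeholder:

# list every table in the source database as [schema].[table]
$tables = sqlcmd -S SOURCESERVERNAME -d SOURCEDBNAME -E -h -1 -W -Q "set nocount on; select quotename(schema_name(schema_id)) + '.' + quotename(name) from sys.tables"

foreach ($t in $tables) {
    if ([string]::IsNullOrWhiteSpace($t)) { continue }

    # export in native format (-N) from the source server
    $file = "FOLDERNAME\$($t -replace '[\[\]]', '' -replace '\.', '-').bcpfile"
    bcp "select * from [SOURCEDBNAME].$t" queryout $file -N -S SOURCESERVERNAME -T -E

    # import into the rebuilt database, 10k rows per batch, keeping identity values (-E)
    bcp "[DESTDBNAME].$t" in $file -N -S DESTSERVERNAME -T -E -b 10000
}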
Thanks for your message. It is useful to know that it can be done this way too. I use this as part of my automated build process, so with a bit of fiddling I could get these alternative scripts up and running, though I suspect it'll take a problem like yours to eventually force me to do it!
Cheers, Josh
I've seen BCP choke on some things as well. Also note that you can't use BCP to import the data files produced by SchemaZen, as they have some custom null flags. The problem I had with BCP, though, was that even when the export was produced by BCP, I was getting duplicate key errors on the import of a table that didn't actually have any duplicate keys; not sure what that was about...