Sql Server 2005 BCP Command Help

PTNL

I'm trying to dump a Sql Server 2005 table's contents into an Excel file.

From a "cmd" prompt, I'm trying the following command:
Code:
bcp "USE Websites; SELECT * FROM WebsiteNames" queryout "C:\XLS\names.xls" -c -CRAW -T

I'm getting a list of errors from the BCP command:
Code:
SQLState = 42S02, NativeError = 208
Error = [Microsoft][SQL Native Client][SQL Server]Invalid object name 'WebsiteNames'.
SQLState = 42000, NativeError = 8180
Error = [Microsoft][SQL Native Client][SQL Server]Statement(s) could not be prepared.

Oddly, when running this command via "exec xp_cmdshell", I get a different set of errors returned in SSMS:
Code:
NULL
Starting copy...
SQLState = HY000, NativeError = 0
Error = [Microsoft][SQL Native Client]Unable to resolve column level collations
SQLState = 01000, NativeError = 5701
Warning = [Microsoft][SQL Native Client][SQL Server]Changed database context to 'WebsiteNames'.
NULL
BCP copy out failed
NULL

The "WebsiteNames" table does exist in the "Websites" database, which is a local database running on my local SQL Server installation. My machine is on a domain, and I have local machine admin rights as well as the local machine's "sa" account (if that makes a difference versus Trusted Authentication).

Anyone have any ideas?
 
Not to toot my own horn, but I think you'll have an easier time using SQL Server Integration Services. It has an Excel adapter, and can write a native Excel file rather than the delimited file that you're storing as an XLS file and expecting Excel to parse and load into a spreadsheet.

Anyway, BCP lets you specify a query--it doesn't let you specify a batch of statements. If you want to access a database that isn't the default for your login, you'll want to use scoped naming for the table name in the SELECT statement. Instead of doing "USE Websites; SELECT * FROM WebsiteNames", you can use "SELECT * FROM Websites..WebsiteNames". This scopes the table name "WebsiteNames" to the default owner or schema in the database "Websites". No USE statement is necessary, then.
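Following that advice, the original command might look something like this (a sketch only; the flags are carried over from the command above, so the output is still character-mode text with an .xls extension, not a native Excel file):

```
bcp "SELECT * FROM Websites..WebsiteNames" queryout "C:\XLS\names.xls" -c -CRAW -T
```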
 
What I need to do is create an Excel file for every table in a particular database. A one-time request for a specific client.

The BCP approach was going to be generated and executed by a stored proc call that I found here:
http://weblogs.sqlteam.com/mladenp/archive/2006/07/25/10771.aspx

The stored proc would be encapsulated within a WHILE loop that used a cursor to iterate a temp table full of the looked-up table names, so that I could have a programmatic way to have a single XLS file generated for every table in a particular database.
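A minimal sketch of that loop, assuming xp_cmdshell is enabled and using the database name from earlier in the thread (the output path and bcp flags are illustrative, not a tested script):

```sql
-- Sketch: generate and run one bcp export per user table in "Websites".
DECLARE @table sysname, @cmd varchar(1000);

DECLARE table_cursor CURSOR FOR
    SELECT name FROM Websites.sys.tables;  -- one row per user table

OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @table;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'bcp "SELECT * FROM Websites..' + @table
             + '" queryout "C:\XLS\' + @table + '.xls" -c -T';
    EXEC master..xp_cmdshell @cmd;  -- runs the export from the server
    FETCH NEXT FROM table_cursor INTO @table;
END

CLOSE table_cursor;
DEALLOCATE table_cursor;
```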

There are 100+ tables in this database that each need to be dumped to an Excel file (one file per table), and the BCP method was the first way I found to do this.

Could you give me some pointers or links that would get me started down the SSIS route, if that's the better approach for what I'm trying to do?
 
Why not just use Enterprise Manager's "Export Data" feature? You can select XLS, and select which tables you want with the wizard. Super fast and easy, especially for one-time jobs.
 
Enterprise Manager doesn't exist in SQL Server 2005. In 2005, the "Export Data" feature is replaced by the DTS Wizard, which is probably the easiest way to get SSIS to do the work for you.

For whatever reason, the Setup team never got an icon for the DTS Wizard into the Start menu. You can run it by executing "DTSWizard.exe" from the Run box. Your data source will be SQL Server and the tables you want; your data destination will be an Excel file you can create.
 
Sorry, I meant SQL Server Management Studio. I was using the name of the 2000 version's tool.
 
The DTSWizard.exe file may do exactly what I need, but I'm getting a few errors. They happen in the "Pre-execute" phase before the actual table reading and dumping would happen. Here is the error list being returned:
Code:
- Pre-execute (Error)
Messages
Error 0xc0202009: Data Flow Task: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E21.
 (SQL Server Import and Export Wizard)
 
Error 0xc0202025: Data Flow Task: Cannot create an OLE DB accessor. Verify that the column metadata is valid.
 (SQL Server Import and Export Wizard)
 
Error 0xc004701a: Data Flow Task: component "Destination 51 - categoryV" (3548) failed the pre-execute phase and returned error code 0xC0202025.
 (SQL Server Import and Export Wizard)

It looks like the first error is causing the second error to be thrown. I'm not sure about the third error, specifically what the "Destination 51" part implies for the table called "categoryV".
 
SSIS is asking OLE DB to create a connection to the Excel spreadsheet, and that's failing. I can't tell why from this distance. "Destination 51" is just the name of one of the destinations you're sinking data into (e.g., Excel) in the SSIS mapping.
 
The Excel file has been interesting to troubleshoot. The DTS Wizard is counting on finding a fresh XLS file. After the DTS Wizard fails, I've opened the XLS file to see that new worksheets were made, one for every table to be dumped and each worksheet is named after the table name. Additionally, the first row in each worksheet now contains the column names from the table.

BTW, attempting to re-run the DTS Wizard to write to the same XLS file with those worksheets still in the XLS file will result in a different failure message (basically complaining that the worksheet name it is trying to create already exists).

Thoughts?
 
Oh, so you're doing a big batch and stuffing everything into one workbook? That might be the problem, as I'm sure that Excel has a limit on how many sheets can be in a single workbook.
 
Hmm... I tried it with about 20 tables, and some quick Google searches said the max worksheet count was 255.
 
Still tracing and troubleshooting this... So far have not been able to clear up the three error messages previously posted that came from DTSWizard.

Any ideas?
 
Sorry I can't be of more help; I don't have a database set up that would let me try to reproduce your results. You might try having the DTS Wizard save off the package it generates, then run and debug it in BI Development Studio. That might reveal more clues, or even better error messages.

If that doesn't work out, you can have SSIS write to CSV files instead, as you were doing with your original plan.
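For the CSV route, the bcp command from earlier in the thread would just need a field terminator; for example (illustrative, not tested; the output path is hypothetical):

```
bcp "SELECT * FROM Websites..WebsiteNames" queryout "C:\CSV\WebsiteNames.csv" -c -t, -T
```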
 
I did some more tracing against the limitations of Excel (http://office.microsoft.com/en-us/excel/HP051992911033.aspx) and against the tables. It looks like I was exceeding the row-count limit, and possibly the per-cell size limit, with a few of the tables.
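If it helps, the tables that blow past Excel 2003's 65,536-row worksheet limit can be spotted up front with a rough row-count query (a sketch; sysindexes row counts are approximate, but close enough for this check):

```sql
-- Tables whose row counts exceed Excel 2003's 65,536 rows per worksheet
-- (one of which is taken by the column-header row in the dump).
SELECT o.name AS table_name, i.rows
FROM sysobjects o
JOIN sysindexes i ON i.id = o.id AND i.indid IN (0, 1)
WHERE o.xtype = 'U' AND i.rows > 65535
ORDER BY i.rows DESC;
```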

I think this issue is closed; I'll just dump the tables that won't exceed the XLS file limitations and let my client know which tables can't be dumped to XLS. They can get a flat-file dump of the problem tables if necessary.

Thanks again for the help with this!!
 
I'm glad you've sorted it out, PTNL. I wouldn't have constructed sample data for myself that was so large, even if I'd had the time to try to repro the problem, so I wouldn't have come across the same issue.

I also would have used Excel 2007 because that's what I've got installed. Notably, Excel 2007 has much higher limits, so maybe you'd like to use that to provide the spreadsheets your customer wants.
 
The client has Excel 2003. However, do you know if the Office 2007 Compatibility add-on for Office 2003 would bypass the normal 2003 limitations? Say I have a 2007 worksheet with over 100,000 rows, and the file is opened by Excel 2003 with the 2007 Compatibility add-on installed -- would the file still be viewable, or are you still held back by 2003's limits?
 
I doubt it; I'd figure the limits are inherent in Excel's calculation engine and in-memory representation, not in the file formats.
 