MCSE Training Kit (Exam 70-228): Microsoft® SQL Server™ 2000 System Administration
Author: Microsoft Corporation
Pages: 720
Companion CD(s): 1
Level: All Levels
Published: 03/28/2001
ISBN: 9780735612471
ISBN-10: 0-7356-1247-1
Price (USD): $59.99

Chapter 7: Populating a Database (continued)


Lesson 5: Using the Bulk Copy Program (Bcp) and the BULK INSERT Transact-SQL Statement

The BULK INSERT Transact-SQL statement and the Bcp command-prompt utility are used to import data into SQL Server 2000 from a text file. These methods are designed to efficiently transfer large amounts of data. In this lesson you will learn how to use each of these commands.


After this lesson, you will be able to

  • Describe how SQL Server 2000 copies data in bulk using text files
  • Describe how to optimize bulk copy operations
  • Use the Bcp command-prompt utility to import and export data
  • Use the BULK INSERT Transact-SQL statement to import data

Estimated lesson time: 30 minutes


Copying Data in Bulk Using Text Files

In addition to using DTS, SQL Server 2000 can import data from text files using the BULK INSERT Transact-SQL statement or the Bcp command-prompt utility. Each is designed to import large quantities of data at very high speed, with minimal transformation during the process. The BULK INSERT statement is generally faster. The Bcp program has been in use for many years (DTS is a recently introduced tool), and veteran database administrators have frequently generated numerous scripts that they use to import data. For this reason, you need to understand Bcp; SQL Server 2000 supports all of those existing Bcp scripts. However, if you need to create new scripts, DTS is much easier to use. Its performance is identical because the BULK INSERT statement is encapsulated in the DTS Bulk Insert task, and the graphical interface walks you painlessly through the formatting issues.

The text files are generally tab-delimited, although other delimiters, such as commas, can also be used. You must specify the format of the file during the import process, either as part of the Bcp command or BULK INSERT statement or in a reusable format file. The text files you import from are frequently generated by earlier versions of SQL Server, by other database programs, or by spreadsheets. The data in the text file can be stored in character mode or in native (binary) mode, the latter for SQL Server to SQL Server transfers.
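For illustration, a short comma-delimited, character-mode data file of the kind this lesson imports might look like the following. The rows and columns are hypothetical; the actual layout of NewData.inc is not shown in this excerpt:

```
1,Widget,2001-03-28
2,Gadget,2001-03-29
3,Sprocket,2001-03-30
```

Each line is one row, each comma separates two fields, and the new-line character terminates the row, matching the -t"," and -r\n switches used later in this lesson.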


NOTE:
Bcp and the BULK INSERT Transact-SQL statement can use the same format file.
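To make this note concrete, here is a sketch of a SQL Server 2000 format file (version 8.0) that either tool could use. The three columns (id, name, created) are hypothetical, not taken from the book's NewData table; each line after the header describes one field: host file field order, host file data type, prefix length, host file data length, terminator, server column order, server column name, and column collation:

```
8.0
3
1   SQLCHAR   0   12   ","    1   id        ""
2   SQLCHAR   0   50   ","    2   name      ""
3   SQLCHAR   0   24   "\n"   3   created   SQL_Latin1_General_CP1_CI_AS
```

You would pass such a file to Bcp with the -f switch or to BULK INSERT with the FORMATFILE option.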

Using Bcp

When using Bcp to import data from a data file, you must specify each parameter. The Bcp syntax is quite cryptic and must be entered precisely or the command will fail. Table 7.7 lists the more commonly used parameters. Bcp parameters are case-sensitive.

The following example imports data from the NewData text file into the NewData table in the NewDatabase database, using character data with a comma as the column delimiter and a new line as the row delimiter, a batch size of 250, a maximum error count of 50, the NewData.err error file, a trusted connection, and a TABLOCK hint:

bcp NewDatabase..NewData in C:\SelfPacedSQL\CH_7\NewData.inc -c
-t"," -r\n -e C:\SelfPacedSQL\CH_7\NewData.err -b250 -m50 -T -h "TABLOCK"


NOTE:
You must enter the code in the preceding example as a single line without a line break.

Table 7.7  Commonly Used Parameters for Bcp

Argument          Description
database_name     The database into which the data is being inserted. If not
                  specified, the default database for the specified user is used.
table_name        The name of the table into which the data is being inserted or
                  from which the data is copied.
"query"           The query used to filter the data being copied out of SQL Server.
in | out          The direction of the bulk copy operation.
format            Used to create a format file.
data_file         The data file used as the source or the destination of the bulk
                  copy operation.
-m                The maximum number of errors that can occur before the bulk copy
                  operation is cancelled. The default is 10.
-f                Specifies the full path of the format file. This parameter is
                  optional.
-e                Specifies the full path of the error file used to record all rows
                  Bcp is unable to transfer to the database. If this option is not
                  used, no error file is created.
-b                Specifies the number of rows per batch of data copied. Each batch
                  is copied to the SQL Server 2000 instance as a single transaction.
-c                Specifies that the bulk copy operation uses character data.
-t                Specifies the field terminator. The default is tab.
-r                Specifies the row terminator. The default is new line.
-S server_name    Specifies the server name (and instance name, if applicable) to
[\instance_name]  which Bcp will connect. The default is the default instance on
                  the local server.
-U                Specifies the login ID.
-P                Specifies the password for the login ID. The default is NULL.
-T                Specifies the use of a trusted connection, using the security
                  credentials of the current user.
-h "hint"         Specifies hints, such as TABLOCK, ROWS_PER_BATCH=nn, and
                  ORDER ... ASC | DESC, that tell SQL Server how to process the
                  imported data most efficiently.
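This lesson's objectives also cover exporting data, which uses the same switches with the out direction. The following is a sketch, not taken from the book, that reverses the earlier import; the output file name is hypothetical:

```
bcp NewDatabase..NewData out C:\SelfPacedSQL\CH_7\NewDataOut.txt -c -t"," -r\n -T
```

To export the result of a query rather than a whole table, you supply a "query" in place of the table name and use the queryout direction.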

Practice: Importing Data Using Bcp

In this practice you import data to SQL Server 2000 using the Bcp command-prompt utility.

To import data using Bcp

  1. Ensure that you are logged on to the SelfPacedSQL.MSFT domain server as Administrator.
  2. Click Start, point to Programs, point to Microsoft SQL Server, and then click Query Analyzer.
  3. SQL Query Analyzer appears, displaying the Connect To SQL Server dialog box.

  4. In the Connect To SQL Server dialog box, select SelfPacedSQL from the SQL Server combo box, and use Windows authentication.
  5. Click OK.
  6. On the File menu, click Open.
  7. The Open Query File dialog box appears.

  8. Select NewData.sql in the C:\SelfPacedSQL\CH_7 folder.
  9. A CREATE TABLE statement appears; it will create a table called NewData, into which you will bulk copy data using the Bcp command.

  10. Click the Execute Query button on the toolbar to execute the NewData.sql statement.
  11. Click Start, point to Programs, point to Accessories, and then click Command Prompt.
  12. A Command Prompt window appears.

  13. Type bcp and press Enter.
  14. Notice the display of available arguments.

  15. Type the following command on a single line:
  16. bcp NorthwindReportData..NewData in
    C:\SelfPacedSQL\CH_7\NewData.inc -c -t"," -r\n -e
    C:\SelfPacedSQL\CH_7\NewData.err -b250 -m50 -T -h "TABLOCK"


    NOTE:
    You must enter the code in the preceding example as a single line without a line break.

  17. Press Enter.
  18. Notice that 1343 rows are copied in batches of 250 rows. The packet size used and the elapsed clock time are also displayed. An empty NewData.err file now appears in the C:\SelfPacedSQL\CH_7 folder.

  19. Close the Command Prompt window.
  20. In SQL Query Analyzer, click the Clear Window button on the toolbar.
  21. In the Query pane, type SELECT * FROM NewData.
  22. Click the Execute Query button on the toolbar.
  23. Notice that 1343 rows are displayed from the NewData table.

  24. Close SQL Query Analyzer.

Using the BULK INSERT Transact-SQL Statement

The BULK INSERT Transact-SQL statement requires information similar to that required by the Bcp command to perform a bulk copy operation. The following example uses the same information as the preceding Bcp example, expressed in Transact-SQL.

BULK INSERT NewDatabase..NewData
FROM 'C:\SelfPacedSQL\CH_7\NewData.inc'
WITH
(
    BATCHSIZE = 250,
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    MAXERRORS = 50,
    TABLOCK
)

Optimizing Bulk Copy Operations

When you perform a bulk copy operation that imports data into SQL Server 2000, the recovery mode of the destination database affects performance. If the database receiving the data is set for full recovery, all row insertions are fully logged. This generates a substantial number of log records during a large bulk copy operation, which can fill up the transaction log and negatively affect performance. For optimum performance during a bulk copy operation, setting the database to bulk-logged recovery is recommended. When you use bulk-logged recovery, the transaction log captures the results of bulk operations, but does not capture row-by-row insertions.
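As a sketch of this recommendation, using the NewDatabase name from this lesson's examples, you can switch the recovery model with ALTER DATABASE around the load:

```sql
-- Switch to bulk-logged recovery for the duration of the load.
ALTER DATABASE NewDatabase SET RECOVERY BULK_LOGGED
GO

-- ... perform the bulk copy operation here ...

-- Return to full recovery afterward and back up the transaction log.
ALTER DATABASE NewDatabase SET RECOVERY FULL
GO
```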

In addition, if you are loading a large amount of data from a single client into an empty table, you should specify the TABLOCK hint (rather than having SQL Server 2000 start with row locks and escalate them to table locks) and use a large batch size using the ROWS_PER_BATCH hint (large batch sizes are generally more efficient than small batch sizes). If the table into which you are importing the data has indexes, it is generally faster to drop all nonclustered indexes and re-create them after the data insertion. If the table has clustered indexes, it is generally faster to order the data in the text file to match the order in the clustered index and specify the ORDER hint.
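Combined in a BULK INSERT statement, these hints might look like the following sketch. It assumes an empty NewData table clustered on a hypothetical OrderID column, with the data file presorted on that column; neither assumption comes from the book's practice files:

```sql
BULK INSERT NewDatabase..NewData
FROM 'C:\SelfPacedSQL\CH_7\NewData.inc'
WITH
(
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK,                  -- take a table lock for the whole load
    ROWS_PER_BATCH = 1343,    -- load as one large batch
    ORDER ( OrderID ASC )     -- data file is presorted on the clustered key
)
```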

If the table into which you are importing data is not empty and contains indexes, the decision on whether to drop indexes depends upon the amount of data being inserted compared to the amount of data existing in the table. The greater the percentage of new data, the faster it is to drop all indexes and re-create them after the data is loaded.
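A sketch of the drop-and-re-create approach, using a hypothetical nonclustered index named IX_NewData_Name on a hypothetical Name column:

```sql
-- Drop the nonclustered index before loading...
DROP INDEX NewData.IX_NewData_Name
GO

BULK INSERT NewDatabase..NewData
FROM 'C:\SelfPacedSQL\CH_7\NewData.inc'
WITH ( DATAFILETYPE = 'char', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )
GO

-- ...and re-create it after the data is loaded.
CREATE NONCLUSTERED INDEX IX_NewData_Name ON NewData (Name)
GO
```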

Lesson Summary

In addition to DTS, SQL Server 2000 provides the command-prompt utility Bcp and the Transact-SQL BULK INSERT statement for importing data from text files. These methods are particularly suited to high-speed insertions of data into a database. Bcp has been used for years as the only method for inserting large amounts of data into SQL Server, and many existing scripts will be in use for many more years. Database administrators must be familiar with Bcp to work with existing scripts (even if the database administrator is not creating any new scripts). The BULK INSERT statement can be used from within SQL Query Analyzer, and it is encapsulated within the DTS Bulk Insert task. There are a number of ways to optimize the speed of insertions of large amounts of data using these methods, including using bulk-logged recovery and dropping nonclustered indexes.








Last Updated: Saturday, July 7, 2001