Import CSV file into SQL Server using T-SQL query


Sometimes, we need to read an external CSV file using a T-SQL query in SQL Server. In such scenarios, the Import/Export Wizard is not an option because we need the result set in the middle of the execution of other queries. Instead, we can use the BULK INSERT command, which imports a data file directly into a SQL Server table.

Let’s have a look at the sample CSV file that we want to import into a SQL table.

Sample CSV File

To download the sample CSV file, click here. The file uses a comma as the column delimiter and contains the following 6 columns; a few illustrative rows are shown after the list.

PersonID – Stores the Id of the person.

FullName – Stores the full name of the person.

PreferredName – Stores the preferred name of the person.

SearchName – Stores the search name of the person.

IsPermittedToLogon – Stores whether the person is allowed to log on.

LogonName – Stores the login name of the person.
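
For illustration, the first few lines of such a file might look like the snippet below. Note that the header and the data rows shown here are made-up sample values rather than the contents of the actual download; they only show the expected layout.

PersonID,FullName,PreferredName,SearchName,IsPermittedToLogon,LogonName
1,John Smith,John,John Smith,1,johnsmith
2,Jane Doe,Jane,Jane Doe,0,NO LOGON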

Now, we need to load this file into a SQL table. So, let’s create the table structure using this script.

IF OBJECT_ID('dbo.SampleCSVTable') IS NOT NULL
	DROP TABLE dbo.SampleCSVTable
GO

CREATE TABLE dbo.SampleCSVTable
(
	PersonID INT,
	FullName VARCHAR(512),
	PreferredName VARCHAR(512),
	SearchName VARCHAR(512),
	IsPermittedToLogon BIT,
	LogonName VARCHAR(512)
)
GO

Now that we have created the table, we need to use the BULK INSERT command to import the CSV data into the table “SampleCSVTable”.

BULK INSERT dbo.SampleCSVTable
FROM 'C:\Sample CSV File.txt'   -- path of the CSV file on the SQL Server machine
WITH
(
	FIELDTERMINATOR = ',',  -- columns are separated by commas
	ROWTERMINATOR = '\n'    -- each row ends with a newline
)
GO
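
If the CSV file starts with a header row that contains the column names, the statement above would try to load that row as data and fail while converting the header text to INT for PersonID. Assuming the header occupies only the first line of the file, a sketch like the following skips it using the FIRSTROW argument:

BULK INSERT dbo.SampleCSVTable
FROM 'C:\Sample CSV File.txt'
WITH
(
	FIRSTROW = 2,           -- start loading from the second line, skipping the header row
	FIELDTERMINATOR = ',',
	ROWTERMINATOR = '\n'
)
GO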

The BULK INSERT command exposes several arguments that can be used while reading a CSV file:

FORMAT = 'CSV'
FIELDQUOTE = 'quote_characters'
FORMATFILE = 'format_file_path'
FIELDTERMINATOR = 'field_terminator'
ROWTERMINATOR = 'row_terminator'

We can use these arguments as per the need. However, keep in mind that the FORMAT and FIELDQUOTE arguments are available only in SQL Server 2017 and later, while FORMATFILE is supported in earlier versions as well.
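
For example, on SQL Server 2017 and later, a file whose text values are wrapped in double quotes could be loaded with a statement along the following lines. This is only a sketch; the path, terminators, and quote character must match your actual file.

BULK INSERT dbo.SampleCSVTable
FROM 'C:\Sample CSV File.txt'
WITH
(
	FORMAT = 'CSV',         -- treat the file as a CSV file
	FIELDQUOTE = '"',       -- double quote is used to enclose text values
	FIELDTERMINATOR = ',',
	ROWTERMINATOR = '\n'
)
GO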

Let’s have a look at the table data which we have imported from the CSV file.

SELECT * FROM dbo.SampleCSVTable
GO
Sample table data
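
Because the imported rows now live in a regular table, they can be used immediately in the rest of the script, which is exactly the scenario the Import/Export Wizard cannot cover. For instance, a follow-up query could filter the imported rows like any other table:

SELECT PersonID, FullName, LogonName
FROM dbo.SampleCSVTable
WHERE IsPermittedToLogon = 1
ORDER BY FullName
GO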

Thanks for reading. Please share your inputs in the comments.



About Gopal Krishna Ranjan

Gopal has 8 years of industry experience in software development. He has hands-on experience in Data Science, Database, Data Warehouse, Big Data, and cloud technologies and has implemented end-to-end solutions. He has extensively worked on SQL Server, Python, Hadoop, Hive, Spark, Azure, Machine Learning, and MSBI (SSAS, SSIS, and SSRS). He also has good experience in Windows and web application development using ASP.Net and C#.


3 thoughts on “Import CSV file into SQL Server using T-SQL query”

  • Jeff Bennett

    Hi,
    This is a good article for basic lists. Often, a DBA will be called upon to import more complex data from MS Excel, including free-form text fields. The biggest surprise comes from trying to pull in data that has special characters like carriage-return and line-feed (CHAR(13) + CHAR(10)) characters. It is just as messy to pull this type of data from SQL Server and paste it into MS Excel. For those times, you may want to include some information about string manipulation functions that allow you to replace special characters during your transfer.

    Another gotcha with some English, Scottish, and American names is the inclusion of apostrophes (') in last names, and even some first names. You need to take these into account when bulk loading data if you have any sort of T-SQL involved, or the single-tick will cause issues. Also, please be aware that an apostrophe is not always the same as a single-tick. You usually find this out when copying SQL code from the web into your query analyzer, but it can make for a lot of editing.

    Thanks
    Jeff Bennett
    St. Louis, MO