Showing posts with label importing. Show all posts

Thursday, March 22, 2012

Bigint stored as varchar has issues...

Hi,
I'm importing contact information into SQL Server from Excel using
OPENROWSET. The issue I'm having is with how the phone numbers get stored.
The phone numbers have no extra characters like dashes, so they appear like
9495551212, which is equivalent to 949-555-1212. The phone number is being
imported to a varchar field, which implicitly converts it to a format like
7.70947e+009. The final destination field is intended to hold the data as it
is originally, so it's a bigint datatype.
My first thought was to use CAST or CONVERT. But I get:
Server: Msg 8114, Level 16, State 5, Line 1
Error converting data type varchar to bigint.
Any ideas?|||Try,
select str(cast('7.70947e+009' as float), 10, 0)
AMB
"Eric" wrote:

> Hi,
> I'm importing contact information into SQL Server from Excel using
> OPENROWSET. The issue I'm having is with how the phone numbers get stored.
> The phone numbers have no extra characters like dashes, so they appear like
> 9495551212, which is equivelant to 949-555-1212. The phone number is being
> imported to a varchar field, which implicitly converts it to a format like
> 7.70947e+009. The final destination field is intended to hold the data as it
> is originally, so it's a bigint datatype.
> My first thought was to use CAST or CONVERT. But I get:
> Server: Msg 8114, Level 16, State 5, Line 1
> Error converting data type varchar to bigint.
> Any ideas?|||Problem #1: 7.70947e+009 is not a representation of an integer. It is
expressed in exponential notation, so it is a float. You first have to cast it
to a float:
select cast(cast ('7.70947e+009' as float) as bigint)
Problem #2 (and this is the kicker): the result will be:
7709470000
Which is probably not a valid phone number. You are losing some of the
significant digits. Try casting it to a bigint first, then to a
varchar(10).
SELECT cast(cast(phonenumber as bigint) as varchar(10))
FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
'Data Source=c:\test.xls;Extended Properties=Excel 8.0')...sheet1$
This worked for me.
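Both problems are easy to see side by side in a small sketch. The first query uses the literal from the thread and shows why casting the truncated string can never recover the number; the second mirrors Louis's double cast on a full value (9495551212 is an illustrative number, not from the source column):

```sql
-- Excel's display string kept only six significant digits,
-- so the round trip yields 7709470000, not the original number:
SELECT CAST(CAST('7.70947e+009' AS float) AS bigint);   -- 7709470000

-- When the full float value is still available (as it is when reading
-- the Excel column directly), float -> bigint -> varchar keeps every digit:
SELECT CAST(CAST(CAST(9495551212.0 AS float) AS bigint) AS varchar(10));
```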
----
Louis Davidson - drsql@.hotmail.com
SQL Server MVP
Compass Technology Management - www.compass.net
Pro SQL Server 2000 Database Design -
http://www.apress.com/book/bookDisplay.html?bID=266
Note: Please reply to the newsgroups only unless you are interested in
consulting services. All other replies may be ignored :)
"Eric" <Eric@.discussions.microsoft.com> wrote in message
news:6BCA27F2-B212-4B4A-9E47-FAAEA48B7A14@.microsoft.com...
> Hi,
> I'm importing contact information into SQL Server from Excel using
> OPENROWSET. The issue I'm having is with how the phone numbers get stored.
> The phone numbers have no extra characters like dashes, so they appear
> like
> 9495551212, which is equivelant to 949-555-1212. The phone number is being
> imported to a varchar field, which implicitly converts it to a format like
> 7.70947e+009. The final destination field is intended to hold the data as
> it
> is originally, so it's a bigint datatype.
> My first thought was to use CAST or CONVERT. But I get:
> Server: Msg 8114, Level 16, State 5, Line 1
> Error converting data type varchar to bigint.
> Any ideas?|||That did the trick! Thank you so much...
"Louis Davidson" wrote:

> Problem #1 7.70947e+009 is not a representation of an integer. It is
> expressed with exponentials, so it is a float. You first have to cast it
> to a float:
> select cast(cast ('7.70947e+009' as float) as bigint)
> Problem #2,( and this is the kicker) The result will be:
> 7709470000
> Which is probably not a valid phone number. You are losing some of the
> significant digits. Try casting it ot a bigint first, then to a
> varhchar(10).
> SELECT cast(cast(phonenumber as bigint) as varchar(10))
> FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
> 'Data Source=c:\test.xls;Extended Properties=Excel 8.0')...sheet1$
> This worked for me.
>
> --
> ----
> Louis Davidson - drsql@.hotmail.com
> SQL Server MVP
> Compass Technology Management - www.compass.net
> Pro SQL Server 2000 Database Design -
> http://www.apress.com/book/bookDisplay.html?bID=266
> Note: Please reply to the newsgroups only unless you are interested in
> consulting services. All other replies may be ignored :)
> "Eric" <Eric@.discussions.microsoft.com> wrote in message
> news:6BCA27F2-B212-4B4A-9E47-FAAEA48B7A14@.microsoft.com...
>
>|||You could try adding IMEX=1 to the connection properties to see if that
helps.
select *
from OpenRowSet(
    'Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\filename.xls;HDR=YES;IMEX=1;',
    Sheet1$
)
"Eric" <Eric@.discussions.microsoft.com> wrote in message
news:6BCA27F2-B212-4B4A-9E47-FAAEA48B7A14@.microsoft.com...
> Hi,
> I'm importing contact information into SQL Server from Excel using
> OPENROWSET. The issue I'm having is with how the phone numbers get stored.
> The phone numbers have no extra characters like dashes, so they appear
> like
> 9495551212, which is equivelant to 949-555-1212. The phone number is being
> imported to a varchar field, which implicitly converts it to a format like
> 7.70947e+009. The final destination field is intended to hold the data as
> it
> is originally, so it's a bigint datatype.
> My first thought was to use CAST or CONVERT. But I get:
> Server: Msg 8114, Level 16, State 5, Line 1
> Error converting data type varchar to bigint.
> Any ideas?|||Hi
Make sure that on the Excel spreadsheet, the column is not "general" but
rather a "text" type cell.
General cells are evaluated by the Excel driver and if they meet numeric
criteria, they are passed as numeric to SQL Server. This is an Excel issue.
Regards
--
Mike Epprecht, Microsoft SQL Server MVP
Zurich, Switzerland
IM: mike@.epprecht.net
MVP Program: http://www.microsoft.com/mvp
Blog: http://www.msmvps.com/epprecht/
"Eric" <Eric@.discussions.microsoft.com> wrote in message
news:6BCA27F2-B212-4B4A-9E47-FAAEA48B7A14@.microsoft.com...
> Hi,
> I'm importing contact information into SQL Server from Excel using
> OPENROWSET. The issue I'm having is with how the phone numbers get stored.
> The phone numbers have no extra characters like dashes, so they appear like
> 9495551212, which is equivelant to 949-555-1212. The phone number is being
> imported to a varchar field, which implicitly converts it to a format like
> 7.70947e+009. The final destination field is intended to hold the data as it
> is originally, so it's a bigint datatype.
> My first thought was to use CAST or CONVERT. But I get:
> Server: Msg 8114, Level 16, State 5, Line 1
> Error converting data type varchar to bigint.
> Any ideas?

Thursday, March 8, 2012

BI Accelerator 1.1 and Bulk Insert

Dear Anyone,

I have been using BI Accelerator 1.1 for 2 years now. In our current project, I encountered the following error during the import phase of BI.

I traced the problem to the BULK INSERT statement that BI Accelerator is executing, which is:

BULK INSERT [Project88_Staging]..[S_Fact_BrdcastAlert] FROM 'C:\Program Files\Microsoft SQL Server Accelerator for BI\data\Project88\Import\S_Fact_BrdcastAlert.txt' WITH ( FORMATFILE = 'C:\Program Files\Microsoft SQL Server Accelerator for BI\data\Project88\DTS\Import\Format\S_Fact_BrdcastAlert.fmt' , DATAFILETYPE = 'widechar', BATCHSIZE = 100000 , MAXERRORS = 100 )

The error that it generates is as follows:

Bulk Insert: DataFileType was incorrectly specified as widechar. DataFileType will be assumed to be char because the data file does not have a Unicode signature.

We export data to text files that BI Accelerator picks up. All other files imported properly without problems except for one fact table text file. We use the same format file that BI Accelerator uses to export the data to file, to make sure that there would be no incompatibilities. File size is about 2.3GB++, which is a standard size for our other fact table files that don't have any errors.

We also checked for data errors and found none. We also checked the txt file we generate; Notepad confirms that it is a Unicode file.
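As the error message says, BULK INSERT only accepts DATAFILETYPE = 'widechar' when the file starts with the 2-byte UTF-16 LE signature (FF FE); Notepad may auto-detect a file as Unicode even when that signature is missing. A hedged way to inspect the raw bytes from T-SQL (OPENROWSET ... SINGLE_BLOB exists only in SQL Server 2005 and later, so on the SQL 2000-era stack in this thread you would check the first two bytes in a hex editor instead):

```sql
-- Read the start of the export file and test for the FF FE signature
-- that BULK INSERT requires before it accepts DATAFILETYPE = 'widechar':
SELECT CASE
           WHEN SUBSTRING(f.BulkColumn, 1, 2) = 0xFFFE
               THEN 'UTF-16 LE signature present: widechar should be accepted'
           ELSE 'No Unicode signature: BULK INSERT falls back to char'
       END AS signature_check
FROM OPENROWSET(BULK
    'C:\Program Files\Microsoft SQL Server Accelerator for BI\data\Project88\Import\S_Fact_BrdcastAlert.txt',
    SINGLE_BLOB) AS f;
```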

Can anyone please shed some light on what is happening? Is this a bug? As much as possible I don't want to place a workaround on this one, since our entire framework for loading data is all the same and has already been set up. Any help would be appreciated.

Thank You|||Any reply would be appreciated.

Joseph Ollero
jollero@.wizardsgroup.com

Thursday, February 16, 2012

Best way to mirror data?

I've got two SQL servers (sql server 2000 version 8.0) (dev) and (live).
I'm constantly exporting/importing across from one to the other using the
export/import tool within SQL Enterprise Manager. I'm consistently having
to update "default values" etc. as they are wiped clean when I
import/export.
I could do a script dump in SQL Query Analyzer for each table, but I was
figuring there had to be a better way to do this...?|||How about a job to do a backup on one, copy it across the network, then
restore on server 2? You also have the copy database wizard in DTS although
I've never used it
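A minimal sketch of that backup/copy/restore approach, assuming hypothetical database, share, and logical file names:

```sql
-- On the dev server: back up to a share the live server can read.
BACKUP DATABASE DevDB
    TO DISK = '\\fileserver\backups\DevDB.bak'
    WITH INIT;

-- On the live server: restore over the existing copy.
RESTORE DATABASE DevDB
    FROM DISK = '\\fileserver\backups\DevDB.bak'
    WITH REPLACE,
         MOVE 'DevDB_Data' TO 'D:\MSSQL\Data\DevDB.mdf',
         MOVE 'DevDB_Log'  TO 'D:\MSSQL\Data\DevDB_log.ldf';
```

Unlike the Import/Export wizard, a restore carries defaults, constraints, and permissions with it, which addresses the "wiped clean" problem.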
Ray Higdon MCSE, MCDBA, CCNA
--
"Jesse Bilsten" <jesse@.vreo.com> wrote in message
news:OaExfP%23$DHA.688@.tk2msftngp13.phx.gbl...
> I've got two SQL servers (sql server 2000 version 8.0) (dev) and (live).
> I'm constantly exporting/importing across from one to the other using the
> export/import tool within SQL Enterprise Manager. I'm consistently having
> to update "default values" etc. as they are wiped clean when I
> import/export.
> I could do a script dump in SQL Query Analyzer for each table, but I was
> figuring there had to be a better way to do this...?
>
>|||You can try SQL Compare from Red-Gate. It isn't perfect, but it is better
than most 'roll your own' systems that I have seen, including my own.
www.red-gate.com
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Jesse Bilsten" <jesse@.vreo.com> wrote in message
news:OaExfP%23$DHA.688@.tk2msftngp13.phx.gbl...
> I've got two SQL servers (sql server 2000 version 8.0) (dev) and (live).
> I'm constantly exporting/importing across from one to the other using the
> export/import tool within SQL Enterprise Manager. I'm consistently having
> to update "default values" etc. as they are wiped clean when I
> import/export.
> I could do a script dump in SQL Query Analyzer for each table, but I was
> figuring there had to be a better way to do this...?
>
>|||Check out DB Ghost from Innovartis at http://www.dbghost.com
Darren Fuller
SQL Server DBA MCSE
*** Sent via Developersdex http://www.examnotes.net ***
Don't just participate in USENET...get rewarded for it!

|||You can easily create a job in SQL Server Agent.
>--Original Message--
>I've got two SQL servers (sql server 2000 version 8.0) (dev) and (live).
>I'm constantly exporting/importing across from one to the other using the
>export/import tool within SQL Enterprise Manager. I'm consistently having
>to update "default values" etc. as they are wiped clean when I
>import/export.
>I could do a script dump in SQL Query Analyzer for each table, but I was
>figuring there had to be a better way to do this...?
>
>.
>
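Such a job can be scripted with the msdb job procedures; a sketch with hypothetical job, database, and path names, and only the backup step shown:

```sql
USE msdb;

-- Create the job, give it one T-SQL step, and register it on the local server.
EXEC dbo.sp_add_job @job_name = N'Refresh live from dev';

EXEC dbo.sp_add_jobstep
    @job_name  = N'Refresh live from dev',
    @step_name = N'Backup dev database',
    @subsystem = N'TSQL',
    @command   = N'BACKUP DATABASE DevDB TO DISK = ''\\fileserver\backups\DevDB.bak'' WITH INIT';

EXEC dbo.sp_add_jobserver @job_name = N'Refresh live from dev';
```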

Friday, February 10, 2012

Best way of importing Excel to SQL Server

Hello,

I need to write something that will transfer Excel data into a SQL Server
table.

For another database application I wrote, I have it importing Excel
spreadsheet data using a cell-by-cell, row-by-row method. This is fully
automated, so the user can choose whatever spreadsheet they want to import
and press a button which sits on a VB6 frontend.
This has been good for that situation, but it can be very slow when there
are large amounts of data to process.

I am just wondering: are there any faster, better alternatives that would
still enable a user to select which Excel spreadsheet to import,
as the application I am writing now will sit on a website frontend, using
ASP, and I'd really like to try and speed things up if I could.

any advice would be much appreciated.

Thankyou,

Oh, and hello, this is my first post here!

Jayne|||Little PussyCat wrote:
> Hello,
> I nee to write something that will transfer excel data into an SQL Server
> table.
> I have for another database application I wrote have it importing Excel
> spreadsheet data using cell by cell, row by row method. This is fully
> automated so the user can choose whatever spreadsheet they want to import
> and press a button which sits on a VB6 frontend.
> This has been good for that situsation but it can be very slow when there
> are large amounts of data to process.
> I am just wondering are there any faster, better alternatives that would
> still enable a user to select which excel spreadsheet to import
> as the application I am writing now will sit on a website frontend, using
> ASP, and I'd really like to try and speed things up if I could.
> any advice would be much appreciated.
> Thankyou,
> Oh, and hello, this is my first post here!
> Jayne

DTS packages are designed just for this sort of thing.

Zach|||nib <individual_news@.nibsworld.com> wrote in message news:<2tqnibF21lmapU1@.uni-berlin.de>...
> Little PussyCat wrote:
> > Hello,
> > I nee to write something that will transfer excel data into an SQL Server
> > table.
> > I have for another database application I wrote have it importing Excel
> > spreadsheet data using cell by cell, row by row method. This is fully
> > automated so the user can choose whatever spreadsheet they want to import
> > and press a button which sits on a VB6 frontend.
> > This has been good for that situsation but it can be very slow when there
> > are large amounts of data to process.
> > I am just wondering are there any faster, better alternatives that would
> > still enable a user to select which excel spreadsheet to import
> > as the application I am writing now will sit on a website frontend, using
> > ASP, and I'd really like to try and speed things up if I could.
> > any advice would be much appreciated.
> > Thankyou,
> > Oh, and hello, this is my first post here!
> > Jayne
> DTS Packages are design just for this sort of thing.
> Zach

I second that notion -- however, there are some problems with
importing Excel data into SQL Server.

I've noticed that if you have a column whose first value is numeric,
and then there are other values in that column that are not numeric,
those values get dropped. Kind of a pain.|||>>
>> DTS Packages are design just for this sort of thing.
>>
>> Zach
>
> I second that notion -- however, there are some problems with
> importing Excel data into SQL Server.

Will a DTS package still be as flexible as what I am doing now,
i.e. would a user still be able to browse through a VB6 / web frontend to the
spreadsheet they want to import,
i.e. can it be coded in VB6 / ASP?
> I've noticed that if you have a column whose first value is numeric,
> and then there are other values in that column that are not numeric,
> those values get dropped. Kind of a pain.

There are loads of problems with importing Excel into SQL Server; I've found
that if the spreadsheet is not set up as text-format data then strange things
will happen.
Similarly, when you export from SQL Server to Excel, leading zeros of numeric
data get dropped, etc...
Anyway,
Enough of that, thanks

Jayne|||Little PussyCat wrote:
>>>DTS Packages are design just for this sort of thing.
>>>
>>>Zach
>>
>>
>>I second that notion -- however, there are some problems with
>>importing Excel data into SQL Server.
>
> Will a DTX package still be as flexible as what I am doing now,
> i.e. would a user still be able to browse through a VB6 / webfrontend to the
> spreadsheet they want to import,
> i.e. can it be coded in VB6 / ASP?

You'd have to investigate exactly how much you can do with it. Even if
DTS requires a hard-coded file name, you could always have your front
end copy the chosen spreadsheet to said name and then run the DTS package.
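That copy-then-run approach can itself be scripted; a sketch using SQL Server 2000's dtsrun utility (server, package, and path names are hypothetical, and xp_cmdshell must be available to the caller):

```sql
-- Overwrite the fixed-name workbook the DTS package reads from...
EXEC master..xp_cmdshell 'copy /Y "C:\uploads\user_choice.xls" "C:\import\import.xls"';

-- ...then launch the package: /S server, /E trusted connection, /N package name.
EXEC master..xp_cmdshell 'dtsrun /S MYSERVER /E /N ImportExcelPackage';
```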

Zach

BEST UDF to handle char or varchar to smalldatetime conversion pr

Hi Experts,
I am importing data from text files; the data is about 800GB, yeah, that's right,
in different files and in different varchar formats, like 01/01/2004, jan 17,
200, mm/dd/yy. There is bad data and good data, but there are millions and
millions of records, and they have every kind of problem anybody can think of.
Is there any site or way to get a function which handles these problems?
Or, if any of you ever had a chance to see this kind of function, pass it to me.
Thanking you in advance for your help.
-Permood|||You can use ISDATE to determine if the specified string can be converted to
a SQL Server datetime datatype. Examples:
SELECT ISDATE('1/01/2004')
SELECT ISDATE('jan 17, 200')
SELECT ISDATE('mm/dd/yy')
SELECT ISDATE('01/01/04')
You might consider performing data type validation and conversion as part of
your ETL process before importing data into SQL Server. This will likely
perform better than using an intermediate SQL Server staging table when you
have a lot of bad data and large volumes.
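For completeness, a minimal sketch of the kind of UDF the subject line asks for, built on ISDATE. The name is hypothetical, and note that ISDATE validates against datetime, whose range is wider than smalldatetime's (1900-01-01 to 2079-06-06), so the extra range check is needed:

```sql
-- Returns the converted value, or NULL when the string is not a usable date.
CREATE FUNCTION dbo.udf_TryToSmallDatetime (@value varchar(50))
RETURNS smalldatetime
AS
BEGIN
    IF ISDATE(@value) = 1
       AND CAST(@value AS datetime) >= '19000101'
       AND CAST(@value AS datetime) <= '20790606'
        RETURN CAST(@value AS smalldatetime)
    RETURN NULL
END
```

Selecting dbo.udf_TryToSmallDatetime(datecol) from a staging table would then yield NULL for bad rows instead of aborting the load.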
--
Hope this helps.
Dan Guzman
SQL Server MVP
"-Permood" <Permood@.discussions.microsoft.com> wrote in message
news:FD3FF986-932F-4FA0-9AD1-E027327AA100@.microsoft.com...
> Hi Experts,
> I am importing data from text files, data is about 800GB, yeah thats
> right.
> in different file and in different varchar format. like 01/01/2004 jan 17,
> 200, mm/dd/yy. it had bad data and good data, but there are millions and
> millions records, and it has all kind of problems any body can think of.
> is there any site or way to get any function which handle these problems,
>
> or if any of you ever had chance to see this kinda function and pass to
> me.
>
> thanking you in advace for your help..
> -Permood
>|||Thanks Dan,
here is the problem: I am very new in my company, some 'DEVELOPERS' don't want
to change their way of doing things, and they decided to use SQL staging and
transformation. I don't have much experience with SQL Server either. So I
wanted to see if something is already there, I mean any function which can
handle most date problems, and then I can enhance it according to my needs.
But thank you for your time and suggestions.
"Dan Guzman" wrote:
> You can use ISDATE to determine if the specified string can be converted to
> a SQL Server datetime datatype: Examples:
> SELECT ISDATE('1/01/2004')
> SELECT ISDATE('jan 17, 200')
> SELECT ISDATE('mm/dd/yy')
> SELECT ISDATE('01/01/04')
> You might consider performing data type validation and conversion as part of
> your ETL process before importing data into SQL Server. This will likely
> perform better than using an intermediate SQL Server staging table when you
> have a lot of bad data and large volumes.
> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP
> "-Permood" <Permood@.discussions.microsoft.com> wrote in message
> news:FD3FF986-932F-4FA0-9AD1-E027327AA100@.microsoft.com...
> > Hi Experts,
> >
> > I am importing data from text files, data is about 800GB, yeah thats
> > right.
> > in different file and in different varchar format. like 01/01/2004 jan 17,
> > 200, mm/dd/yy. it had bad data and good data, but there are millions and
> > millions records, and it has all kind of problems any body can think of.
> >
> > is there any site or way to get any function which handle these problems,
> >
> >
> > or if any of you ever had chance to see this kinda function and pass to
> > me.
> >
> >
> > thanking you in advace for your help..
> > -Permood
> >
>
>
