Tuesday, March 27, 2012
bind variables / parameter queries
I'm writing an Access pass-through query against a SQL Server backend and I need some advice on passing parameters. Currently I use VBA to substitute literal values for the parameters before passing the query to SQL Server. However, I go through a loop thousands of times with different literals for these parameters, which fills up the server's plan cache. In Oracle, there is a way to use bind variables for the parameters so that only one copy of the query is cached.
Does anyone know how I can do this in SQL Server?
For instance, I have 20,000 employees and I'm pulling info by SS#:
Select * from EmpTable where SS_number = [SSN]
Is there a way I can pass this query to SQL Server and then pass the value of [SSN] as I loop through the dataset?
Thanks.|||Write a stored procedure, and instead of calling the database engine 20,000 times, just call it once and pass it a list of 20,000 numbers.
Come to think of it, where would you get 20,000 numbers? Sounds like you might want to look for a JOIN solution|||SQL Server actually goes you one better, in that its ODBC drivers will automagically parameterize a query for you (unless you get really creative in modifying the query).
As Rudy pointed out though, if you have more than 20 iterations from a given client, you really ought to be thinking about a JOIN based solution... Doing that kind of thing on that scale one row at a time is WAY too much work for me!
-PatP|||Thanks, guys. I'll get write access to the backend and write a stored proc.
I'll have to read up on how to pass values to the proc (I'm guessing it's like a function).
Thanks again.|||How about this?:
Dim cnn As ADODB.Connection
Set cnn = New ADODB.Connection
cnn.Open "DSN=PKRebate2001", "sa", ""
Dim cmd As ADODB.Command
Set cmd = New ADODB.Command
With cmd
    .ActiveConnection = cnn
    .CommandText = "sp_UpdateCustomerUnique"
    .CommandType = adCmdStoredProc
    .Parameters.Refresh
    .Parameters("@ImportMonth").Value = IM
    .Execute
End With
Set cmd = Nothing
Set cnn = Nothing|||If you are looking to get automatic parameterization, that VBA code is conceptually good.
-PatP
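For readers outside the ADO world, the bind-variable technique this thread is after can be sketched with Python's standard-library sqlite3 module as a stand-in for SQL Server (the table and data here are invented for illustration): the statement text never changes, so the engine keeps a single cached plan and only the bound value varies.

```python
import sqlite3

# In-memory stand-in for the real backend; table and rows are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE EmpTable (SS_number TEXT, emp_name TEXT)")
conn.executemany("INSERT INTO EmpTable VALUES (?, ?)",
                 [("111-22-3333", "Alice"), ("444-55-6666", "Bob")])

def lookup(ssn):
    # The ? placeholder is the bind variable: one statement, one cached
    # plan, a different bound SSN on every call through the loop.
    row = conn.execute(
        "SELECT emp_name FROM EmpTable WHERE SS_number = ?", (ssn,)
    ).fetchone()
    return row[0] if row else None

print(lookup("444-55-6666"))  # Bob
```

In ADO the same effect comes from a Command object with explicit Parameters, as the VBA snippet above shows.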
Tuesday, March 20, 2012
Big table
I am writing a small dating site and now I am designing the DB.
At some point I realized that I will need to create a table with more
than 12 columns and I felt that it is not such a good idea.
I've tried normalizing the database, but it had no effect.
So let's say that I have the following table:
CREATE TABLE users
(
id int CONSTRAINT pk_user_id PRIMARY KEY,
username varchar(100),
password varchar(100),
isAccVerificated bit,
verificationNumber int,
f_name nvarchar(250),
l_name nvarchar(250),
email varchar(250),
age int,
location nvarchar(250),
visitors int,
isMale bit,
picture_data varbinary(max)
)
"picture_data" contains the actual picture as a byte array.
So my question is: what is preferable to do?
I believe this is not such a good idea to leave this table like this.
Thanks in advance.|||12 columns is a perfectly reasonable number if you need them. I don't
think most people would call that large. Your defined sizes of those
columns are a different matter. You surely don't need 250 characters
each for first and last name! It's wise to choose sensible maximum
sizes for character columns - if you put the space in there, someone
will use it.
Going by the column names it doesn't look like you have any obvious
transitive dependencies but you do appear to have some other problems:
ID is the only key, which it shouldn't be if it's an artificial key.
How about enforcing the constraint that the username or email address
be unique and non-NULL? Usually that makes sense for sites that require
registration.
All your columns except ID are nullable! Do you really want people to
register anonymously? What's the point of a row with only an ID and no
other information?
isMale? Why not call this Gender or Sex and use the ISO codes for Sex,
which are 1=Male and 2=Female?
Store date of birth rather than age. Otherwise you have the problem of
keeping the age up to date.
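Computing age on demand from a stored date of birth is a one-liner; this Python sketch (the function name is mine) shows the usual day-accurate formula:

```python
from datetime import date

def age(dob, today=None):
    # Subtract one year if this year's birthday hasn't happened yet;
    # the tuple comparison yields True (1) or False (0).
    today = today or date.today()
    return today.year - dob.year - (
        (today.month, today.day) < (dob.month, dob.day))

print(age(date(1990, 6, 15), today=date(2012, 3, 20)))  # 21
```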
Don't store passwords in the database; store secure hashes instead.
Password recovery mechanisms are insecure and bad practice, even where
high levels of security aren't a requirement (people tend to re-use
their passwords so you could end up exposing the password to someone's
E-Banking account!)
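A minimal sketch of the store-a-hash-instead approach, in Python with only the standard library (the function names and iteration count are illustrative, and PBKDF2 here stands in for whatever password-hashing scheme the site actually adopts):

```python
import hashlib, hmac, os

# Illustrative cost parameter only; tune for your hardware.
ITERATIONS = 100_000

def hash_password(password, salt=None):
    # A random per-user salt defeats precomputed (rainbow-table) attacks.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
```

Only the salt and digest ever touch the database; the plaintext password is never stored.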
--
David Portas
SQL Server MVP
--|||John,
Along with what David said, I'd avoid storing the pictures in the database.
For one thing, surely you want the members to be able to post multiple
pictures, and this design won't allow for that. Store them in the file
system instead.
Take a look at:
http://www.aspfaq.com/show.asp?id=2149
-Andy
<John.Arthur@.gmail.com> wrote in message
news:1113326413.314223.31090@.o13g2000cwo.googlegro ups.com...
> Hi,
> I am writing a small dating site and now I am designing the DB.
> At some point I realized that I will need to create a table with more
> than 12 columns and I felt that it is not such a good idea.
> I've tried normalizing the database, but It was with no effect.
> So let's sat that I have the following table:
> CREATE TABLE users
> (
> id int CONSTRAINT pk_user_id PRIMARY KEY,
> username varchar(100),
> password varchar(100),
> isAccVerificated bit,
> verificationNumber int,
> f_name nvarchar(250),
> l_name nvarchar(250),
> e-mail varchar(250),
> age int,
> location l_name nvarchar(250),
> visitors int,
> isMale bit
> picture_data binary
> )
> "picture_data" contain the real picture in byre array.
> So my question is what is preferably to do.
> I believe this is not such a good idea to leave this table like this.
> Thanks in advance.|||You can easily normalize the pictures into another table so that
multiple pictures could be stored under the same account. Personally,
I like that approach better than cluttering the file system.
-- Dave|||True. To each his own... :)
"Dave" <dafergu2@.hotmail.com> wrote in message
news:1113329911.088937.55740@.l41g2000cwc.googlegro ups.com...
> You can easily normalize the pictures into another table so that
> multiple pictures could be
> stored under the same account. Personally, I like that approach better
> than cluttering the
> file system.
> -- Dave
Friday, February 24, 2012
Best XML structure for import into SQL 2005
I am writing a contingency plan for when our SQL Server (2005) goes down. Basically, when our application is about to write data to SQL Server it checks the connection. If the connection passes, it outputs to SQL; if it fails, it generates an XML document detailing the data so that the data can be imported later.
Each XML document needs to define which database and which table needs to be updated/inserted.
So SQL Server needs to be able to read a number of XML documents and update the relevant tables depending on the data held within the XML; an example:
<Output>
<Record>
<Database>DW</Database>
<Table>dbo.tblActivity</Table>
<DateTime>11-05-2007 10:03:33.099</DateTime>
<Data>
<strActivityPK>9353|HCR008</strActivityPK>
<strActivityCode>HCR008</strActivityCode>
<strActivityDescription>HCR Complete</strActivityDescription>
<datTarget>2007-05-18</datTarget>
<datActual />
<intMatterIntCode>9353</intMatterIntCode>
<intRecurringActivityFlag>0</intRecurringActivityFlag>
<intETLFlag>0</intETLFlag>
</Data>
</Record>
</Output>
I need SSIS to open this XML, read the Table element and insert or update dbo.tblActivity (in this case) with the data defined in the Data element. If updating, the update should only take place if the DateTime is more recent than a field (datLastUpdate) on the corresponding row in the database.
Can anyone offer some guidance (for example is the XML optimally structured for import by SQL) and advice on how (which tasks to use) to go about setting this SSIS package up.
PS: thanks to Darren, I already have the file selection and open process sorted using a Foreach Loop and some enumeration to select each XML file. It is what to do with the XML afterwards that I need to get working...
Thanks,
Drammy
bump|||
Drammy wrote:
Hi, I am writing a contingency plan for incase our SQL Server (2005) goes down. Basically when our applicatino is about to write data to the SQL Server it checks the connection. If the connection passes it outputs to SQL, if it fails it generates an XML document detailing the data so that the data can be imported later.
Each XML document needs to define which database and which table needs to be updated/inserted.
So the SQL needs to be able to read a number of XML documents and update the relevant tables according dependent on the data held within the XML; some examples:
<Output>
<Record>
<Database>DW</Database>
<Table>dbo.tblActivity</Table>
<DateTime>11-05-2007 10:03:33.099</DateTime>
<Data>
<strActivityPK>9353|HCR008</strActivityPK>
<strActivityCode>HCR008</strActivityCode>
<strActivityDescription>HCR Complete</strActivityDescription>
<datTarget>2007-05-18</datTarget>
<datActual />
<intMatterIntCode>9353</intMatterIntCode>
<intRecurringActivityFlag>0</intRecurringActivityFlag>
<intETLFlag>0</intETLFlag>
</Data>
</Record>
</Output>I need SSIS to open this XML, read the Table element and insert or update dbo.tblActivity (in this case) with the data defined in the Data element. If updating, the update should only take place if the DateTime is more recent than a field (datLastUpdate) on the corresponding row in the database.
Can anyone offer some guidance (for example is the XML optimally structured for import by SQL) and advice on how (which tasks to use) to go about setting this SSIS package up.
Ps. thanks to Darren I already have the file selection and open process sorted using a Foreach and some enumeration to select each XML file. It is what to do with the XML afterwards I need get working...
Thanks,
Drammy
If you are intending to update different tables with this, you won't be able to use a single data flow. You could use the XML Task to read the XML and an Execute SQL task to run a statement using the OPENXML function in SQL Server.
|||Thanks jwelch, I actually ended up using a data flow to pull the XML data into a temp table and then ran a SQL task to update the corresponding target tables.
Drammy
|||That's a good approach too
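For anyone rebuilding this kind of package, the per-record parse can be sketched with Python's stdlib xml.etree. The element names come from the sample document above; reading the DateTime value as dd-MM-yyyy is an assumption (the thread never says which convention the files use), and the actual upsert against datLastUpdate is left to the Execute SQL step:

```python
import xml.etree.ElementTree as ET
from datetime import datetime

# Trimmed copy of the sample document from the thread.
SAMPLE = """<Output>
  <Record>
    <Database>DW</Database>
    <Table>dbo.tblActivity</Table>
    <DateTime>11-05-2007 10:03:33.099</DateTime>
    <Data>
      <strActivityCode>HCR008</strActivityCode>
      <intMatterIntCode>9353</intMatterIntCode>
    </Data>
  </Record>
</Output>"""

def parse_records(xml_text):
    root = ET.fromstring(xml_text)
    for rec in root.findall("Record"):
        yield {
            "table": rec.findtext("Table"),
            # Assumed dd-MM-yyyy with milliseconds; used for the
            # "only update if newer than datLastUpdate" check.
            "stamp": datetime.strptime(rec.findtext("DateTime"),
                                       "%d-%m-%Y %H:%M:%S.%f"),
            # Column name -> value, straight from the Data children.
            "data": {col.tag: col.text for col in rec.find("Data")},
        }

rec = next(parse_records(SAMPLE))
```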
Tuesday, February 14, 2012
Best way to import a text file every day
I'm in the midst of writing some code to import a file, and then realized
that maybe that's entirely the wrong way to do it. The file in question is
not "simply" formatted, it's based on a semicolon delimiter and has a few
"informational" lines scattered through the document that need to be filtered
out (section headers).
Is this something that the DTS should be doing for us? Can it...
1) read in files that it sees are new?
2) deal with sometimes variant records?
If so, any pointers on where to start reading up? The MS docs that I've seen
so far are rather "dense".
Maury
Maury,
DTS can be used to accomplish the task. What you need is the ActiveX script
task. So check it out. It's more about VB script than about sql then.
There is also a group for DTS where you may find more help.
Quentin
"Maury Markowitz" <MauryMarkowitz@.discussions.microsoft.com> wrote in
message news:3DAC19B9-6780-4EBF-B682-9787CF0C04F9@.microsoft.com...
> I'm in the midst of writing some code to import a file, and then realized
> that maybe that's entirely the wrong way to do it. The file in question is
> not "simply" formatted, it's based on a semicolon delimiter and has a few
> "informational" lines scattered through the document that need to be
> filtered
> out (section headers).
> Is this something that the DTS should be doing for us? Can it...
> 1) read in files that it sees are new?
> 2) deal with sometimes variant records?
> If so, any pointers on where to start reading up? The MS docs that I've
> seen
> so far are rather "dense".
> Maury
|||Hi Maury
> The file in question is
> not "simply" formatted, it's based on a semicolon delimiter and has a few
> "informational" lines scattered through the document that need to be
filtered
> out (section headers).
>
the simplest way to import a file is to prepare it first. Can
you write a small program to convert your file to a *fixed* format without
headers (or with only one header line)? Such a program takes about the
same time to run as the DTS import itself, so the total completion time
would only roughly double...
Vlastik
|||Thanks guys, I'll take a look. I'm very familiar with VB so this sounds like
the right way to go.
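The pre-conversion program Vlastik suggests is small in any language. A Python sketch, assuming (my guess, not stated in the thread) that the "informational" section headers are exactly the lines without a semicolon:

```python
import csv

# Invented sample: section headers interleaved with ';'-delimited records.
RAW = """SECTION: Trades
ABC;100;9.50
DEF;200;12.25
SECTION: Settlements
GHI;50;3.10
"""

def data_rows(text):
    # Keep only lines that contain the delimiter (drops section headers),
    # then split the survivors into fields.
    lines = (ln for ln in text.splitlines() if ";" in ln)
    return list(csv.reader(lines, delimiter=";"))

rows = data_rows(RAW)
```

The filtered rows can then be written back out as a plain delimited file for a straightforward DTS transform to load.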