Showing posts with label thousands. Show all posts

Sunday, February 19, 2012

Best way to search a table based on another table's data?

I have a table of keywords (hundreds/thousands of records):

KeywordID  KeywordName
---------  -----------
1          Apple
2          Orange
3          Pear


I then have a table of products (also hundreds/thousands of records):

ProductID  ProductName  ProductDescription
---------  -----------  ------------------
123        Apple Tree   Better than an orange tree, this...
124        Great Scent  This great scent smells like orange...


What's the most efficient way to search the entire product table and return all records that contain any of the keywords from the keyword table (in either ProductName or ProductDescription)?

Thanks,
Ron

Here is one way... I used a JOIN with a LIKE. This will give you one row per matching keyword:

Code Snippet

create table keyword
(
KeywordId int primary key,
KeywordName varchar(10)
)
insert into keyword
select 1,'Apple'
union all
select 2,'Orange'
union all
select 3,'Pear'
go

create table product
(
ProductId int primary key,
ProductName varchar(20),
ProductDescription varchar(40)
)
insert into product
select 123,'Apple Tree','Better than an orange tree, this...'
union all
select 124,'Great Scent','This great scent smells like orange...'
go

select Keyword.KeywordName, Product.ProductName, Product.ProductDescription
from Product
join Keyword
  on Product.ProductName like '%' + Keyword.KeywordName + '%'
  or Product.ProductDescription like '%' + Keyword.KeywordName + '%'

Which returns:

Code Snippet

KeywordName  ProductName  ProductDescription
-----------  -----------  ------------------
Apple        Apple Tree   Better than an orange tree, this...
Orange       Apple Tree   Better than an orange tree, this...
Orange       Great Scent  This great scent smells like orange...

Or if you don't want duplicates, you could use:

Code Snippet

select Product.ProductName, Product.ProductDescription
from Product
where exists (select *
from keyword
where Product.ProductName like '%' + Keyword.KeywordName + '%'
or Product.ProductDescription like '%' + Keyword.KeywordName + '%')

Another possibility is to use full text search...

|||

select p.*
from keywords k inner join products p
on p.ProductName like '%' + k.KeywordName + '%'
or p.ProductDescription like '%' + k.KeywordName + '%'
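The same pattern can be sketched end to end in Python, using SQLite as a stand-in for SQL Server (note SQLite concatenates strings with `||` where T-SQL uses `+`; the tables and rows mirror the post):

```python
import sqlite3

# In-memory database mirroring the Keyword/Product tables from the post.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE keyword (KeywordId INTEGER PRIMARY KEY, KeywordName TEXT);
    CREATE TABLE product (ProductId INTEGER PRIMARY KEY,
                          ProductName TEXT, ProductDescription TEXT);
    INSERT INTO keyword VALUES (1, 'Apple'), (2, 'Orange'), (3, 'Pear');
    INSERT INTO product VALUES
        (123, 'Apple Tree',  'Better than an orange tree, this...'),
        (124, 'Great Scent', 'This great scent smells like orange...');
""")

# Join with LIKE: one row per (keyword, product) match.
matches = conn.execute("""
    SELECT k.KeywordName, p.ProductName
    FROM product p
    JOIN keyword k
      ON p.ProductName LIKE '%' || k.KeywordName || '%'
      OR p.ProductDescription LIKE '%' || k.KeywordName || '%'
    ORDER BY k.KeywordName, p.ProductName
""").fetchall()

# EXISTS variant: each product appears at most once.
distinct = conn.execute("""
    SELECT p.ProductName FROM product p
    WHERE EXISTS (SELECT 1 FROM keyword k
                  WHERE p.ProductName LIKE '%' || k.KeywordName || '%'
                     OR p.ProductDescription LIKE '%' || k.KeywordName || '%')
""").fetchall()
```

As in the T-SQL versions, LIKE with a leading wildcard cannot use an index, so on large tables full-text search is the scalable option.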

Friday, February 10, 2012

best way to compile thousands of TSQL stored procedures?

I have a custom application that on occasion requires thousands of TSQL
files (on the file system) to be compiled to the database.

What is the quickest way to accomplish this?

We currently have a small vbs script that gets a list of all the files,
then loops around a call to "osql". Each call to osql opens/closes a
connection to the destination database (currently across the network).

<murray_shane56@.hotmail.com> wrote in message
news:1110557746.585156.86170@.f14g2000cwb.googlegro ups.com...
>I have a custom application that on occasion requires thousands of TSQL
> files (on the file system) to be compiled to the database.
> What is the quickest way to accomplish this?
> We currently have a small vbs script that gets a list of all the files,
> the loops around a call to "osql". each call to osql opens/closes a
> connection to the destination database (currently across the network).

Since text files compress well, you could zip them up, FTP or copy them to
the server, then unzip them and run your vbs script on the server side
(using xp_cmdshell, a scheduled job, DTS etc.).
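The zip-and-copy idea can be sketched in Python (the directory layout and file contents here are made up for illustration):

```python
import zipfile, pathlib, tempfile

# Hypothetical layout: a directory of .sql script files to deploy.
src = pathlib.Path(tempfile.mkdtemp())
for i in range(3):  # stand-in for thousands of files
    (src / f"proc_{i}.sql").write_text(f"CREATE PROCEDURE p{i} AS SELECT {i};")

# Text compresses well, so ship one small archive instead of many files.
archive = src / "scripts.zip"
with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for f in sorted(src.glob("*.sql")):
        zf.write(f, arcname=f.name)

# "Server side": unzip, then run the scripts locally, avoiding the
# per-file network round trip.
dest = pathlib.Path(tempfile.mkdtemp())
with zipfile.ZipFile(archive) as zf:
    zf.extractall(dest)

extracted = sorted(p.name for p in dest.glob("*.sql"))
```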

Also, are you able to reduce the number of files you run? Do you change
thousands of procedures at a time, or are you able to use your source
control system to identify only the objects which have been modified?

Simon

|||

(murray_shane56@.hotmail.com) writes:
> I have a custom application that on occasion requires thousands of TSQL
> files (on the file system) to be compiled to the database.
> What is the quickest way to accomplish this?
> We currently have a small vbs script that gets a list of all the files,
> the loops around a call to "osql". each call to osql opens/closes a
> connection to the destination database (currently across the network).

VBS is not my best game, but I would expect it to be possible to use
ADO from VBScript. Thus, you could open a connection and a command
object, and then run .Execute with the option adExecuteNoRecords.

This will not only save you from opening and closing the connection,
but also from spawning an OSQL process for each procedure.
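The single-connection idea can be sketched in Python against SQLite (a stand-in for ADO against SQL Server; the script files and their contents are hypothetical):

```python
import sqlite3, pathlib, tempfile

# Hypothetical script files, one CREATE statement per file.
scripts = pathlib.Path(tempfile.mkdtemp())
for i in range(5):
    (scripts / f"proc_{i:04}.sql").write_text(
        f"CREATE VIEW v{i} AS SELECT {i} AS n;")

# One connection for the whole run -- the point of the suggestion:
# no per-file connect/disconnect, no per-file process spawn.
conn = sqlite3.connect(":memory:")
for path in sorted(scripts.glob("*.sql")):
    conn.executescript(path.read_text())  # analogous to Command.Execute

installed = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE type = 'view'").fetchone()[0]
```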

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

|||

"Simon Hayes" <sql@.hayes.ch> wrote in message
news:4231d822$1_2@.news.bluewin.ch...
> <murray_shane56@.hotmail.com> wrote in message
> news:1110557746.585156.86170@.f14g2000cwb.googlegro ups.com...
>>I have a custom application that on occasion requires thousands of TSQL
>> files (on the file system) to be compiled to the database.
>>
>> What is the quickest way to accomplish this?
>>
>> We currently have a small vbs script that gets a list of all the files,
>> the loops around a call to "osql". each call to osql opens/closes a
>> connection to the destination database (currently across the network).
>>
> Since text files compress well, you could zip them up, FTP or copy them to
> the server, then unzip them and run your vbs script on the server side
> (using xp_cmdshell, a scheduled job, DTS etc.).
> Also, are you able to reduce the number of files you run? Do you change
> thousands of procedures at a time, or are you able to use your source
> control system to identify only the objects which have been modified?
> Simon

Ditto on moving the operation to the server and on seeing if you can
trim the number of objects down.

If there are no object dependencies on order of execution,
I would look into multi-threading this operation as well.

I would write out a series of CMD file scripts calling OSQL with your
existing vb script, and then call a master CMD script that uses START
to run each of the sub-CMD scripts in its own process.*
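The partitioning behind the master-CMD/START idea can be sketched in Python (a stand-in: the worker here only counts its batch, where a real one would run osql, or open its own connection, against its slice of the files):

```python
import concurrent.futures

# Stand-in for the list of script files gathered by the vbs script.
files = [f"proc_{i:04}.sql" for i in range(20)]

# Partition the list into one batch per worker, mimicking one CMD
# script per START-ed OSQL process (not one process per file).
WORKERS = 4
batches = [files[i::WORKERS] for i in range(WORKERS)]

def run_batch(batch):
    # A real worker would open ONE connection and execute every
    # file in its batch; here we just report how many it would run.
    return len(batch)

with concurrent.futures.ThreadPoolExecutor(max_workers=WORKERS) as pool:
    done = sum(pool.map(run_batch, batches))
```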

Also, make sure that you are using integrated security with OSQL as I recall
it runs faster than SQL security.

* Don't use START before each call to OSQL, or you will end up like
Mickey did in that movie with all those brooms.

|||

(murray_shane56@.hotmail.com) writes:
> I have a custom application that on occasion requires thousands of TSQL
> files (on the file system) to be compiled to the database.
> What is the quickest way to accomplish this?
> We currently have a small vbs script that gets a list of all the files,
> the loops around a call to "osql". each call to osql opens/closes a
> connection to the destination database (currently across the network).

One more thing: if you continue to use OSQL, be sure to specify the
-I option to get SET QUOTED_IDENTIFIER ON. This matters if you use
indexed views or indexes on computed columns.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

|||

Hi

You may want to look at:
http://tinyurl.com/5299q

Another alternative is to concatenate the files before running them.

John
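The concatenation idea can be sketched in Python (file names and contents are hypothetical; joining with GO batch separators lets a single osql invocation run every script):

```python
import pathlib, tempfile

# Hypothetical script files to be combined into one batch file.
d = pathlib.Path(tempfile.mkdtemp())
for i in range(3):
    (d / f"proc_{i}.sql").write_text(f"CREATE PROCEDURE p{i} AS SELECT {i};")

# Concatenate with GO separators so one osql call runs them all,
# instead of one connection per file.
combined = "\nGO\n".join(p.read_text() for p in sorted(d.glob("*.sql")))
(d / "all.sql").write_text(combined)

go_count = combined.count("GO")  # separators between the three scripts
```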

<murray_shane56@.hotmail.com> wrote in message
news:1110557746.585156.86170@.f14g2000cwb.googlegro ups.com...
>I have a custom application that on occasion requires thousands of TSQL
> files (on the file system) to be compiled to the database.
> What is the quickest way to accomplish this?
> We currently have a small vbs script that gets a list of all the files,
> the loops around a call to "osql". each call to osql opens/closes a
> connection to the destination database (currently across the network).