Showing posts with label custom. Show all posts

Tuesday, March 27, 2012

Binding dataset to a reportviewer control

Hi,

I need to bind my custom dataset, which I retrieve by executing a query, to a ReportViewer control. I don't want to create a typed dataset in the application. Is there any way to do it? The dataset that I bind will be built in a separate class. If I plan to use an object data source, what are the steps to follow? Please help.

thanks,

Saravanan

www.gotreportviewer.com provides information about how to use object data sources, specifically at: http://www.gotreportviewer.com/objectdatasources/index.html

-- Robert

|||

Hi Robert,

Thanks for the reply. I have already read through that article. What I really need is to bind a dataset (which I create in my application) to the ReportViewer. The dataset will have some columns from the database (through a SELECT query), and some columns that I add myself. I want to bind this custom dataset to the ReportViewer control in local processing mode. Please let me know if this can be done, and the steps to do it.

Thanks in advance,

Saravanan.

Saturday, February 25, 2012

Better method to count records in Custom Paging for SQL Server 2005

Here's my problem: since I migrated to SQL Server 2005, I have been able to use the ROW_NUMBER() OVER method to improve my custom paging stored procedure. But there's one thing still bothering me: I'm still using the old, classic COUNT instruction to find my total number of rows, which slows my stored procedure down a little. What I want to know is: is there a more efficient way to get the grand total of rows without using the COUNT instruction? Here's my stored procedure:

SELECT RowNum, morerecords, Ad_Id
FROM (SELECT ROW_NUMBER() OVER (ORDER BY Ad_Id) AS RowNum,
             morerecords = (SELECT COUNT(Ad_Id) FROM Ads),
             Ad_Id
      FROM Ads) AS test
WHERE RowNum BETWEEN 11 AND 20

The morerecords subquery is the problem; that field is the one I'm using to count all my records, but it's a waste of performance in a custom paging method, since it checks every record. (Normally there are a ton of conditions with a lot of inner joins, but I simplified things in my example.) I hope I was clear enough in my explanation, and that someone will be able to help me. Thanks for your time.

Well, since you want to join a single value (the row count) of a table with other columns from the table, the single value must be returned as a result set from a subquery or a joined table. If you don't like using COUNT(Ad_Id) to get the row count, you can join the sysindexes table to get the row count for a specific table. For example:

SELECT RowNum, Ad_Id, s.rowcnt AS RowCnt
FROM (SELECT ROW_NUMBER() OVER (ORDER BY Ad_Id) AS RowNum, Ad_Id
      FROM Ads) AS test,
     sysindexes s
WHERE RowNum BETWEEN 11 AND 20
  AND s.id = OBJECT_ID('Ads')
  AND s.indid = (SELECT MIN(indid)
                 FROM sysindexes WHERE id = OBJECT_ID('Ads'))

If you have a clustered index on the table, you can replace the MIN(indid) subquery with 1, since a clustered index always has indid = 1.
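Another option on SQL Server 2005 is COUNT(*) OVER (), a windowed aggregate that returns the grand total alongside ROW_NUMBER() in the same pass, with no separate subquery. A minimal sketch of that pattern, run here against SQLite via Python (SQLite 3.25+ also supports window functions); the Ads/Ad_Id names follow the thread's example and the data is made up:

```python
import sqlite3

# Build a throwaway Ads table with 100 rows so the paging query has data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Ads (Ad_Id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO Ads (Ad_Id) VALUES (?)",
                 [(i,) for i in range(1, 101)])

# One pass: ROW_NUMBER() numbers the rows, COUNT(*) OVER () attaches the
# grand total to every row, and the outer WHERE slices out page 2.
rows = conn.execute("""
    SELECT RowNum, Ad_Id, TotalRows
    FROM (SELECT ROW_NUMBER() OVER (ORDER BY Ad_Id) AS RowNum,
                 COUNT(*) OVER () AS TotalRows,
                 Ad_Id
          FROM Ads) AS paged
    WHERE RowNum BETWEEN 11 AND 20
""").fetchall()

print(rows[0])    # first row of the page: (RowNum, Ad_Id, TotalRows)
print(len(rows))  # page size
```

The same inner SELECT works in T-SQL on SQL Server 2005, so the total-row count rides along with the page instead of being recomputed by a correlated COUNT.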

Friday, February 10, 2012

best way to compile thousands of TSQL stored procedures?

I have a custom application that on occasion requires thousands of TSQL
files (on the file system) to be compiled to the database.

What is the quickest way to accomplish this?

We currently have a small VBS script that gets a list of all the files,
then loops around a call to "osql". Each call to osql opens and closes a
connection to the destination database (currently across the network).

|||

<murray_shane56@.hotmail.com> wrote in message
news:1110557746.585156.86170@.f14g2000cwb.googlegroups.com...
>I have a custom application that on occasion requires thousands of TSQL
> files (on the file system) to be compiled to the database.
> What is the quickest way to accomplish this?
> We currently have a small vbs script that gets a list of all the files,
> then loops around a call to "osql". Each call to osql opens/closes a
> connection to the destination database (currently across the network).

Since text files compress well, you could zip them up, FTP or copy them to
the server, then unzip them and run your vbs script on the server side
(using xp_cmdshell, a scheduled job, DTS etc.).

Also, are you able to reduce the number of files you run? Do you change
thousands of procedures at a time, or are you able to use your source
control system to identify only the objects which have been modified?

Simon

|||

(murray_shane56@.hotmail.com) writes:
> I have a custom application that on occasion requires thousands of TSQL
> files (on the file system) to be compiled to the database.
> What is the quickest way to accomplish this?
> We currently have a small vbs script that gets a list of all the files,
> then loops around a call to "osql". Each call to osql opens/closes a
> connection to the destination database (currently across the network).

VBS is not my best game, but I would expect it to be possible to use
ADO from VBScript. Thus, you could open a connection and a command
object, and then run .Execute with the option adExecuteNoRecords.

This will not only save you from opening and closing the connection,
but also from spawning an OSQL process for each procedure.
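Erland's one-connection approach can be sketched outside VBScript as well. Here is a minimal Python illustration, with sqlite3 standing in for ADO (the script directory layout and file contents are made up for the example):

```python
import sqlite3
import tempfile
from pathlib import Path

# Sketch of the one-connection approach: open a single connection, then
# execute every script file over it, instead of spawning a new osql
# process (and a new network connection) per file.
with tempfile.TemporaryDirectory() as d:
    script_dir = Path(d)
    (script_dir / "proc1.sql").write_text("CREATE TABLE t1 (x INTEGER);")
    (script_dir / "proc2.sql").write_text("CREATE TABLE t2 (y TEXT);")

    conn = sqlite3.connect(":memory:")  # one connection for the whole run
    for sql_file in sorted(script_dir.glob("*.sql")):
        # No result set is expected; this mirrors adExecuteNoRecords.
        conn.executescript(sql_file.read_text())

    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]

print(tables)
```

The win is the same as in the thread: connection setup and process startup are paid once, not once per file.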

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

|||

"Simon Hayes" <sql@.hayes.ch> wrote in message
news:4231d822$1_2@.news.bluewin.ch...
> <murray_shane56@.hotmail.com> wrote in message
> news:1110557746.585156.86170@.f14g2000cwb.googlegroups.com...
>>I have a custom application that on occasion requires thousands of TSQL
>> files (on the file system) to be compiled to the database.
>>
>> What is the quickest way to accomplish this?
>>
>> We currently have a small vbs script that gets a list of all the files,
>> then loops around a call to "osql". Each call to osql opens/closes a
>> connection to the destination database (currently across the network).
>>
> Since text files compress well, you could zip them up, FTP or copy them to
> the server, then unzip them and run your vbs script on the server side
> (using xp_cmdshell, a scheduled job, DTS etc.).
> Also, are you able to reduce the number of files you run? Do you change
> thousands of procedures at a time, or are you able to use your source
> control system to identify only the objects which have been modified?
> Simon

Ditto on moving the operation to the server and seeing if you can trim
the number of objects down.

If there are no object dependencies on order of execution,
I would look into multi-threading this operation as well.

I would write out a series of CMD file scripts calling OSQL with your
existing VB script, and then call a master CMD script that uses START
to run each of the sub-CMD scripts in its own process.*

Also, make sure that you are using integrated security with OSQL; as I
recall, it runs faster than SQL security.

* Don't put START before each call to OSQL, or you will end up like
Mickey did in that movie with all those brooms.

|||

(murray_shane56@.hotmail.com) writes:
> I have a custom application that on occasion requires thousands of TSQL
> files (on the file system) to be compiled to the database.
> What is the quickest way to accomplish this?
> We currently have a small vbs script that gets a list of all the files,
> then loops around a call to "osql". Each call to osql opens/closes a
> connection to the destination database (currently across the network).

One more thing: if you continue to use OSQL, be sure to specify the
-I option to get SET QUOTED_IDENTIFIER ON. This is needed if you use
indexed views or indexed computed columns.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

|||

Hi

You may want to look at:
http://tinyurl.com/5299q

Another alternative is to concatenate the files before running them.

John

<murray_shane56@.hotmail.com> wrote in message
news:1110557746.585156.86170@.f14g2000cwb.googlegroups.com...
>I have a custom application that on occasion requires thousands of TSQL
> files (on the file system) to be compiled to the database.
> What is the quickest way to accomplish this?
> We currently have a small vbs script that gets a list of all the files,
> then loops around a call to "osql". Each call to osql opens/closes a
> connection to the destination database (currently across the network).
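John's concatenation idea can be sketched as follows: join the individual scripts with GO batch terminators so that osql is invoked once on one combined file, rather than once per file. A minimal Python illustration (the file names and procedure bodies are hypothetical):

```python
import tempfile
from pathlib import Path

# Concatenate individual .sql files into one script, separated by GO,
# which osql treats as a batch terminator. One osql invocation then
# replaces thousands of per-file invocations.
with tempfile.TemporaryDirectory() as d:
    script_dir = Path(d)
    (script_dir / "proc1.sql").write_text("CREATE PROCEDURE p1 AS SELECT 1")
    (script_dir / "proc2.sql").write_text("CREATE PROCEDURE p2 AS SELECT 2")

    parts = [f.read_text().rstrip()
             for f in sorted(script_dir.glob("*.sql"))]
    combined = "\nGO\n".join(parts) + "\nGO\n"  # GO after every batch
    (script_dir / "all.sql").write_text(combined)

    batch_count = combined.count("\nGO\n")

print(batch_count)
```

Each CREATE PROCEDURE must sit in its own batch, which is why the GO separators matter; the combined file can then be shipped to the server and run there, combining this with Simon's zip-and-copy suggestion.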

best way to clean up temp files

I have a custom Data Flow task that creates temp files in the system temp directory during processing. A lot of times we'll use SSIS to do one data transformation, running and tweaking the package along the way in the designer; if we notice something incorrect in the data view, we just hit the stop button and fix it. However, when we do this, the Cleanup() function isn't called, and my temp files are left in the temp directory when they really ought to be disposed of.

Is there a method that gets called every time when the DtsDebugHost quits, whether it finished, didn't finish properly, or was stopped in the middle? What would be a good way (other than having some service that monitors what temp files are used by what processes) to clean up temp files after we don't need them?

~Steve

There is no way to do this in SSIS, since stopping a package from the designer is just like stopping a process in a debugger (generally speaking, no cleanup happens). However, there is an open mode called delete-on-close (in native code; I don't know if this maps to managed code too), and you could use that setting to have the OS perform the cleanup.

Thanks,

Matt
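Matt's delete-on-close idea has close analogues outside native code; for instance, Python's tempfile module offers the same semantics, where the file disappears as soon as the handle is closed. A minimal sketch (the prefix and payload are arbitrary):

```python
import os
import tempfile

# With delete=True (the default), NamedTemporaryFile removes the file as
# soon as the handle is closed, so no stray temp files survive a stop.
# (On Windows, os.O_TEMPORARY maps to the native FILE_FLAG_DELETE_ON_CLOSE
# flag, where the OS deletes the file even on abnormal process exit.)
tmp = tempfile.NamedTemporaryFile(prefix="dataflow_", suffix=".tmp",
                                  delete=True)
tmp.write(b"intermediate rows would be buffered here")
tmp.flush()

existed_while_open = os.path.exists(tmp.name)  # file is live while open
tmp.close()                                    # the OS removes it here
exists_after_close = os.path.exists(tmp.name)

print(existed_while_open, exists_after_close)
```

The ownership shifts from the task's Cleanup() method, which may never run, to the file handle itself, which the OS always reclaims.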