
Tuesday, March 27, 2012

binding textbox text to a database field

hello

Is it possible to bind a single field from a database table to a textbox?

I am trying to open a web form with five textboxes on it. When it loads, is it possible to have each textbox filled with a different field from a row in a database, selected with a SQL statement?

Thanks in advance NubNub

Wrong forum.

Yes, that is quite possible.

But you will get better assistance if you post your question in one of the .NET ASP related forums.


Binding data to text box

I want to bind some data to a text box from a SQL Server database, but when I run the page I get an error. Here is my code:

<form id="form1" runat="server">
<div>
<asp:SqlDataSource ID="SqlDataSource1" runat="server" ConnectionString="<%$ ConnectionStrings:imacsConn %>" SelectCommand="SELECT Reportnumber FROM [SummaryBlue] WHERE REPORTNUMBER = @REPORTNUMBER">
<SelectParameters>
<asp:QueryStringParameter Name="REPORTNUMBER" QueryStringField="REPORTNo" Type="String" />
</SelectParameters>
</asp:SqlDataSource>
<asp:TextBox ID="TextBox1" runat="server" Columns="<%$ ConnectionStrings:imacsConn %>"></asp:TextBox>
</div>
</form>

Error:

Exception Details: System.FormatException: Input string was not in a correct format.

Source Error:

Line 25: </SelectParameters>
Line 26: </asp:SqlDataSource>
Line 27: <asp:TextBox ID="TextBox1" runat="server" Columns="<%$ ConnectionStrings:imacsConn %>"></asp:TextBox></div>
Line 28: </form>

The Columns property of a text box is how many characters wide it should be (an integer property). You are trying to assign it a string.

<asp:TextBox ID="TextBox1" runat="server" Columns="<%$ ConnectionStrings:imacsConn %>">


Hi xbhavesh,

From your description, I understand you want to bind data to a single TextBox. Based on my experience and research, a SqlDataSource can be bound to list data-bound controls, but a TextBox is not one of them, so we cannot simply bind a TextBox to a SqlDataSource.

I found two solutions and hope they are helpful to you.

1. Bind the data source to a GridView, and then bind the GridView cell value to the TextBox. Don't forget to set the GridView to not visible and to call the DataBind method in Page_Load. Here is the sample code:

<asp:TextBox ID="TextBox1" runat="server" Text='<%# GridView1.Rows[0].Cells[0].Text %>'></asp:TextBox>

<asp:SqlDataSource ID="SqlDataSource1" runat="server" ConnectionString="<%$ ConnectionStrings:testConnectionString %>"
    SelectCommand="SELECT [id] FROM [Table1] WHERE ([id] = 2)">
</asp:SqlDataSource>

<asp:GridView ID="GridView1" Visible="false" runat="server" DataSourceID="SqlDataSource1">
</asp:GridView>

protected void Page_Load(object sender, EventArgs e)
{
    TextBox1.DataBind();
}

2. Create your custom control to inherit DataBoundControl:

http://msdn2.microsoft.com/en-us/library/system.web.ui.webcontrols.databoundcontrol.aspx

P.S. We should bind to the Text property, not Columns.

Binding a Text Box to a datasoure - Please help

I have searched all over Google for answers, and everything I have found so far did not work for one reason or another.

Here is what I am looking for. I am coding in Visual Studio using C#. I have been able to create a connection and create text boxes with a submit button that, when pressed, enters data into a database (SQL Server). What I have been unable to do is:

1. Display data in a text box.

2. Update data.

3. Create buttons to navigate through records.

I know I can do this with the GridView or DataGrid, but I really need a custom form for what I am doing.

Hello,

You can use a SqlDataSource to do insert, update, and delete for your textboxes. You can read the data from the data source through a DataReader object and bind the data to your textboxes manually. For example: txtComments.Text = Convert.ToString(dReader["Comments"]);

You can call SqlDataSource.Update() to update your form data when your textboxes' values have changed.

You can create a LinkButton that sends an id to the page in its Click event to retrieve data for these textboxes.

Or you can do everything from code behind as in ASP.NET 1.X programmatically.

If you have more questions, please post back.

Sunday, March 25, 2012

Bind a label to an SqlDataSource

Hi all,

I would like to do something I thought was simple but I can't seem to figure it out. I would like to bind the Text properties of two labels to columns in a SqlDataSource, much the same as you can bind a listbox to a SqlDataSource.

Is this possible?

The reason why I'm trying it this way is because I originally used an OleDB connection and datareader, but it doesn't seem to work on our service provider's server (it keeps saying that it can't find the database) even though it works on the four other servers I've tried. It definitely connects to the database when I use the SqlDataSource on a listbox control, but it fails when I use the same connection string with the OleDB connection.

Is this the best way to go about it, or should I persist with fixing the OleDB/datareader approach (the service provider has been no help at all)?

Thanks.

If you want to use a SqlDataSource, why not bind it to a GridView so that you can view all records:

<asp:SqlDataSource ID="mySDS" runat="server" ConnectionString="<%$ ConnectionStrings:Conn2000 %>"
SelectCommand="SELECT FirstName, LastName, Title FROM Employees">
</asp:SqlDataSource>
<asp:GridView runat="server" ID="myGV" DataSourceID="mySDS" AutoGenerateColumns="true">
</asp:GridView>
BTW, what's the exception you got when connecting with the OleDB connection? OleDB connection strings can differ from SqlDataSource connection strings; you can check http://www.connectionstrings.com/

Tuesday, March 20, 2012

big problem with data conversion

Dear colleagues, I have a big problem with data conversion. For example, on my Windows Forms application (C#) I have one text box, and in my table I have a field of type float. When I use the following conversion or cast:

float i = (float)Convert.ToDouble(textBox1.Text);

and insert the number 12,3, my table ends up with a number like 12.30000000000032324!

What must I do with this conversion to get the number 12,3 in my table?

Hi,

Although this is not a C# forum, you should take a look at the Help for a more precise datatype than float.

http://msdn2.microsoft.com/en-us/library/b1e65aza.aspx

In addition, see the float definition in Books Online for more information about float in SQL Server.

You might want to look at the decimal type instead.
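To illustrate the difference, here is a small sketch in Python rather than C#, with Python's decimal module standing in for SQL Server's decimal/numeric type (the exact trailing digits shown are typical for IEEE 754 doubles, not taken from the original post):

```python
# Why 12.3 comes back with trailing noise: a binary float cannot
# represent 12.3 exactly, while a decimal type stores it as entered.
from decimal import Decimal

as_float = 12.3                # stored as the nearest binary double
as_decimal = Decimal("12.3")   # stored exactly as written

print(f"{as_float:.20f}")      # shows the trailing noise, e.g. 12.30000000000000071054
print(as_decimal)              # 12.3
```

The same reasoning applies in the poster's setup: declaring the column as decimal(p, s) instead of float keeps the value as typed.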

HTH, Jens K. Suessmeyer.

http://www.sqlserver2005.de

Thursday, March 8, 2012

BI Accelerator SetCurrentDay Problem

I'm having a problem with the SetCurrentDay DTS package
of the BI Accelerator.

I'm loading a set of data using text files, so I'm using
Master_Import.dts and Master_Update.dts unchanged. Then I run the VBScripts that are also generated by Accelerator (VAMPanorama1_Load_VAMPanorama.vbs).

This loads the data from text files into the Staging Database and SM Database, and also runs the Set Current Day package with a fixed date of "2001-12-31".

The problem is that data for the year of the Current Date is multiplied by a factor of six. When I run the SetCurrentDay package alone to set a new date (say "2003-07-01"), either by opening the package in Enterprise Manager and changing the global variable or by using the DTSRUN utility, I get the same problem: the data is multiplied by a factor of 6 for the year 2003, while data for 2002 and 2001 shows up correctly at the year, quarter, and month level.

I tried this on the Retail Analysis sample project that comes with Accelerator and had exactly the same problem.

Has anyone seen something like this? Am I doing something wrong here?

Any help is appreciated.

I assume you've done something else by now, but I had the same problem. After verifying that the proper number of facts were loaded, and that a join wasn't creating additional records, I stumbled on the solution: there is no problem with the cube.

The problem is that the auto-generated base time dimensions each have an 'All' level and they should not. Setting all of them (for me they are Time.Standard and Week.Standard) to 'Current' will fix the problem and show the proper number of transactions. The reason they are multiplied is that 'All' aggregates across all time periods including the actual periods and the calculated ones like Current, Last Year, Change from Last, etc.

It's been some time since I last saw this thing, but could you refresh my memory: what do you mean by "Setting all of them (for me they are Time.Standard and Week.Standard) to 'Current'"? Is this changing the name of the All member, or removing the All member?

Thanks.

It's actually constraining the dates to 'Current', but I found the real source of the problem.

The auto-generated date dimensions have custom members built into the dimension table, and one of them is called 'Current'. My original recommendation was that you could not leave the date dimensions set to 'All' (meaning that they are unconstrained) because that caused the totals to be much higher than they should be. Setting them to 'Current' or to some other specific value (a particular year, month, etc.) would fix the totals.

Here's the real problem: the unary operator property is not set by the spreadsheet generator. If you go into Analysis Manager and edit the Time.Standard dimension, you can fix it. For each level, look at the advanced properties and click the ellipsis button beside the value for Unary Operators. A dialog comes up that lets you check "Enable Unary Operators" and tell it there is an existing field of <level name>_Oper (e.g. Year_Oper). This one change (setting it for all levels) fixes all of the auto-generated date dimensions.

What's happening is that without the unary operator, calculated dates like Current, Previous, Change, %Change, etc. are treated just like any other year value (like 2004), so the cube adds together all of the values, causing data to be counted both for the actual date on which it occurs and for the calculated dates (Current, etc.). The unary operator lets the value in <level name>_Oper determine whether the values will aggregate or not, and thankfully, the attribute has the right values.

Thanks a lot Barclay, I got it working fine now! :)

Thursday, February 16, 2012

Best way to load data from text files

Hi,
I have a problem I'm hoping someone can give me some pointers with.

I need to load data from several text files into one table. The format of the files is simple: each line is comma separated, with double quotes around each element, e.g.

"parameter 1","value 1","parameter 2","value 2"...
"parameter 12","value 12","parameter 13","value 13"...

However, the files themselves will have different numbers of columns e.g file 1 may have 8 columns, file 2 may have 16 columns.

I'm going to load the data into a table that has at least as many columns as the longest file. The table columns are all varchar, and are named simply as [Col001] [Col002] [Col003] etc...

The first two columns of this table must be left empty during the load (I use these later on), so the data entry will start at [Col003].

My question is: what is the best way to do this? I thought perhaps using a BULK INSERT in a stored procedure might do the trick, but I haven't used it before and haven't got very far. I gather another approach might be to use the bcp utility. Someone has also suggested a DTS package, but the filenames will be suffixed with a current date/time stamp, so I don't think that will work.

My preferred approach would be BULK INSERT, but I'm open to any pointers.

Many Thanks
Greg

You could use BCP or BULK INSERT, but you would need to know, a priori, how many columns are in the current input file and then use a different format file that matches.

I would bring them into a staging table as the first step. Then I would have a control proc count the number of commas in a row and then call a proc set up to handle that number of columns. The called proc would then put the columns into the table designed to hold the largest number of columns.

That way, if someone comes along later and adds another file with n more columns, you merely create another proc to handle n more columns, alter your table to hold n more columns, and modify your control proc to count up to n more columns and call the appropriate sproc.
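For illustration, the counting-and-padding idea can be sketched outside SQL Server as well. A rough Python pass (the file handling, function name, and table width are assumptions for this sketch, not part of the original advice):

```python
# Sketch of the staging approach: parse quoted, comma-separated rows
# with a variable column count and pad each one to the fixed width of
# the staging table (Col001, Col002, ...). The first two columns are
# left empty, as the poster requires, so data starts at Col003.
import csv

def load_rows(path, table_width):
    rows = []
    with open(path, newline="") as f:
        for record in csv.reader(f):        # handles the double quotes
            padded = ["", ""] + record      # Col001/Col002 stay empty
            padded += [""] * (table_width - len(padded))
            rows.append(padded)
    return rows
```

Here len(record) plays the role of the control proc counting commas: it tells you which handler (or which format file) the row needs.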

Tuesday, February 14, 2012

Best way to import a text file every day

I'm in the midst of writing some code to import a file, and then realized
that maybe that's entirely the wrong way to do it. The file in question is
not "simply" formatted, it's based on a semicolon delimiter and has a few
"informational" lines scattered through the document that need to be filtered
out (section headers).
Is this something that the DTS should be doing for us? Can it...
1) read in files that it sees are new?
2) deal with sometimes variant records?
If so, any pointers on where to start reading up? The MS docs that I've seen
so far are rather "dense".
Maury
Maury,
DTS can be used to accomplish the task. What you need is the ActiveX script
task, so check it out. It's more about VBScript than about SQL at that point.
There is also a group for DTS where you may find more help.
Quentin
"Maury Markowitz" <MauryMarkowitz@.discussions.microsoft.com> wrote in
message news:3DAC19B9-6780-4EBF-B682-9787CF0C04F9@.microsoft.com...
> I'm in the midst of writing some code to import a file, and then realized
> that maybe that's entirely the wrong way to do it. The file in question is
> not "simply" formatted, it's based on a semicolon delimiter and has a few
> "informational" lines scattered through the document that need to be
> filtered
> out (section headers).
> Is this something that the DTS should be doing for us? Can it...
> 1) read in files that it sees are new?
> 2) deal with sometimes variant records?
> If so, any pointers on where to start reading up? The MS docs that I've
> seen
> so far are rather "dense".
> Maury
Hi Maury

> The file in question is
> not "simply" formatted, it's based on a semicolon delimiter and has a few
> "informational" lines scattered through the document that need to be
filtered
> out (section headers).
>
The simple way to import a file is to prepare the file first. Can
you write a simple program to convert your file to a *fixed* format without
headers (or with only one header line)? Such a program runs in roughly the
same time as the DTS import, so the total completion time would only about
double ...
Vlastik
Thanks guys, I'll take a look. I'm very familiar with VB so this sounds like
the right way to go.
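The preprocessing pass suggested above could look like this Python sketch (the rule that a section header contains no semicolon is an assumption for illustration; adjust the test to match the real files):

```python
# Copy a semicolon-delimited file, dropping the "informational"
# section header lines so a plain DTS import can load the result.
# A line is treated as a header when it has no semicolon (assumed).
def clean_file(src_path, dst_path):
    kept = dropped = 0
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            if ";" in line:      # data rows are semicolon-delimited
                dst.write(line)
                kept += 1
            else:                # section headers are filtered out
                dropped += 1
    return kept, dropped
```

The same shape also works inside a DTS ActiveX script task in VBScript, since the logic is just "read line, keep or drop, write line".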


Sunday, February 12, 2012

Best way to format a letter so that text can break across a page

I'm currently using SQL reporting services to produce letters. Most of the
formatting can be achieved however I'm having problems with page breaks.
I've currently set up the report using Text Boxes in the following format:
Name
Address
Salutation
Main Content
From
I need to ensure that the Main Content follows on directly from the
Salutation field. This works fine if the letter is 1 or 3 pages long.
However, in the case where the Main Content requires a few lines on page 2 it
places all the main content onto Page 2.
How can I ensure that the Main Content always follows directly on from the
Salutation? Is it possible to do this using a table instead of text boxes?
Any help would be greatly appreciated!

I should have mentioned that the report is being produced in PDF format.
I read in a newsgroup article that PDF won't split a single row between two
pages unless the row itself is longer than one page. Is this still the case,
or is there a workaround? If it is true, then one solution might be to
bring back the main content text in sections, although this isn't too
appealing!
"SV" wrote:
> I'm currently using SQL reporting services to produce letters. Most of the
> formatting can be achieved however I'm having problems with page breaks.
> I've currently set up the report using Text Boxes in the following format:
> Name
> Address
> Salutation
> Main Content
> From
> I need to ensure that the Main Content follows on directly from the
> Salutation field. This works fine if the letter is 1 or 3 pages long.
> However, in the case where the Main Content requires a few lines on page 2 it
> places all the main content onto Page 2.
> How can I ensure that the Main Content always follows directly on from the
> Salutation? Is it possible to do this using a table instead of text boxes?
> Any help would be greatly appreciated!

Best way to filter out noise words from user entered search string

Hi there,
I'm implementing a full-text search on my company's site and am trying to get
around the 'Query contained only noise words' error. What's the best way of
stripping the noise words from the search string and then notifying the user
that the words were removed (kind of like a Google search)? Is there a way to
compare the search string with the noise word file and remove all instances
of matching words?
Thanks!
-- are we all computer monkeys?
The best way to do this is to stop mssearch, replace the contents of your
noise word list with a single space and then restart MSSearch and rebuild
your catalog.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
"Mr. Roundhill" <MrRoundhill@.discussions.microsoft.com> wrote in message
news:21EE62C6-74C2-4E01-B3A7-79D3612E5584@.microsoft.com...
> Hi there,
> I'm implemeting a full text search on my company's site and am trying to
get
> around the 'Query contained only noise words' error. What's the best way
of
> stripping the text from the search string and then notifying the user that
> the words were removed? (kind of like a google search). Is there a way
to
> compare the search string with the noise word file and remove all
instances
> of matching words?
> Thanks!
> --
> -- are we all computer monkeys?
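As an alternative to blanking the noise-word file, the stripping the asker describes can be done before the query reaches SQL Server. A Python sketch (the noise-word set below is a tiny made-up sample; in practice you would load the words from the server's actual noise-word file):

```python
# Strip noise words from a user's search string before building the
# full-text query, and report which words were removed (Google-style).
# NOISE_WORDS is a small illustrative set, not the real noise file.
NOISE_WORDS = {"a", "an", "and", "or", "the", "of", "to", "in", "is"}

def strip_noise(query):
    kept, removed = [], []
    for word in query.split():
        (removed if word.lower() in NOISE_WORDS else kept).append(word)
    return " ".join(kept), removed

clean, dropped = strip_noise("the cat and the hat")
# clean == "cat hat"; dropped == ["the", "and", "the"]
```

If every word is removed, the query would have raised 'Query contained only noise words' anyway, so you can tell the user up front instead of running the search.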

Best way to do a free text search

Hi

I have a .NET 2 website that works from a SQL 2000 database. I am building a form that will allow the user to type in any search criteria, and based on their input I need to produce some results (that's the simplest way to put it).

I will need to search various parts of my database to find similar-sounding information, and I was wondering what is the best way to do this. I had the following thoughts:

1) Search columns using the Soundex T-SQL function (but I'm not sure how good this is?)

2) Remove all noise words from the user input (e.g. and, or, the, etc...) and then use a regular expression to search the fields in the database.

Alternatively, are there some third-party components that can do this for me?

Many thanks in advance

You can enable full-text search on the database and create a full-text index on the column in the table you want to search. Then you will be able to search for any word that appears inside that column. Here are a couple of links that might be helpful to you.

http://www.databasejournal.com/features/mssql/article.php/3441981

http://www.wherecanibuyit.co.uk/ASP/full-text-search.html

http://msdn2.microsoft.com/en-us/library/ms177652.aspx
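The Soundex idea from the question is also available directly in T-SQL via the SOUNDEX() and DIFFERENCE() functions. For a feel of what it does, here is the standard American Soundex algorithm sketched in Python (an illustration of the algorithm, not SQL Server's internal code):

```python
# American Soundex: map a word to a letter plus three digits so that
# similar-sounding names (Smith/Smyth) get the same code.
def soundex(word):
    codes = {"bfpv": "1", "cgjkqsxz": "2", "dt": "3",
             "l": "4", "mn": "5", "r": "6"}

    def code(ch):
        for letters, digit in codes.items():
            if ch in letters:
                return digit
        return ""  # vowels and h/w/y carry no code

    word = word.lower()
    result = word[0].upper()
    prev = code(word[0])
    for ch in word[1:]:
        digit = code(ch)
        if digit and digit != prev:   # skip adjacent duplicate codes
            result += digit
        if ch not in "hw":            # h/w do not separate duplicates
            prev = digit
    return (result + "000")[:4]       # pad/truncate to four characters
```

Soundex is crude (it only looks at rough consonant classes), which is why full-text search is usually the better fit for general "similar sounding" queries.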


Hi

Thanks for that, it was a great help

I have noticed that there is a ROW_NUMBER function in SQL 2005. Is there any way to get this functionality in SQL 2000?

One method I thought of was to put the data in a temp table, loop through each record and give it a number, and then pull out 10 records at a time as the user pages through.

Are there any other ways I could do this?

Many thanks in advance


The short answer is yes. You may want to search for articles on Custom Paging with SQL Server 2000. There are ways to do it, but they are not as simple or clean as the ROW_NUMBER() function. Here is an article to get you started.

http://www.4guysfromrolla.com/webtech/042606-1.shtml

Friday, February 10, 2012

BEST UDF to handle char or varchar to smalldatetime conversion pr

Hi Experts,
I am importing data from text files; the data is about 800GB, yeah that's right,
in different files and in different varchar formats, like 01/01/2004, jan 17,
200, mm/dd/yy. There is bad data and good data, there are millions and
millions of records, and they have every kind of problem anybody can think of.
Is there any site or way to get a function which handles these problems,
or, if any of you ever had the chance to see this kind of function, can you pass it to me?
Thanking you in advance for your help.
-Permood

You can use ISDATE to determine if the specified string can be converted to
a SQL Server datetime datatype. Examples:
SELECT ISDATE('1/01/2004')
SELECT ISDATE('jan 17, 200')
SELECT ISDATE('mm/dd/yy')
SELECT ISDATE('01/01/04')
You might consider performing data type validation and conversion as part of
your ETL process before importing data into SQL Server. This will likely
perform better than using an intermediate SQL Server staging table when you
have a lot of bad data and large volumes.
--
Hope this helps.
Dan Guzman
SQL Server MVP
"-Permood" <Permood@.discussions.microsoft.com> wrote in message
news:FD3FF986-932F-4FA0-9AD1-E027327AA100@.microsoft.com...
> Hi Experts,
> I am importing data from text files, data is about 800GB, yeah thats
> right.
> in different file and in different varchar format. like 01/01/2004 jan 17,
> 200, mm/dd/yy. it had bad data and good data, but there are millions and
> millions records, and it has all kind of problems any body can think of.
> is there any site or way to get any function which handle these problems,
>
> or if any of you ever had chance to see this kinda function and pass to
> me.
>
> thanking you in advace for your help..
> -Permood
>

Thanks Dan,

Here is the problem: I am very new in my company, some 'developers' don't want
to change their way of doing things, and they decided to use SQL staging and
transformation. I don't have much experience with SQL Server either, so I
wanted to see if something is already out there, I mean any function which
handles most date problems, that I can then enhance according to my needs.

But thank you for your time and suggestions.
"Dan Guzman" wrote:
> You can use ISDATE to determine if the specified string can be converted to
> a SQL Server datetime datatype: Examples:
> SELECT ISDATE('1/01/2004')
> SELECT ISDATE('jan 17, 200')
> SELECT ISDATE('mm/dd/yy')
> SELECT ISDATE('01/01/04')
> You might consider performing data type validation and conversion as part of
> your ETL process before importing data into SQL Server. This will likely
> perform better than using an intermediate SQL Server staging table when you
> have a lot of bad data and large volumes.
> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP
> "-Permood" <Permood@.discussions.microsoft.com> wrote in message
> news:FD3FF986-932F-4FA0-9AD1-E027327AA100@.microsoft.com...
> > Hi Experts,
> >
> > I am importing data from text files, data is about 800GB, yeah thats
> > right.
> > in different file and in different varchar format. like 01/01/2004 jan 17,
> > 200, mm/dd/yy. it had bad data and good data, but there are millions and
> > millions records, and it has all kind of problems any body can think of.
> >
> > is there any site or way to get any function which handle these problems,
> >
> >
> > or if any of you ever had chance to see this kinda function and pass to
> > me.
> >
> >
> > thanking you in advace for your help..
> > -Permood
> >
>
>
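Dan's pre-import cleansing idea can be sketched outside SQL Server as well. A hypothetical Python pass over the raw files (the format list covers only the layouts the poster mentions and would need extending for the real 800GB of data):

```python
# Validate and normalize the mixed date strings before bulk loading.
# FORMATS covers the layouts the poster mentions (01/01/2004,
# "jan 17, ...", mm/dd/yy); strings that match none are flagged bad.
from datetime import datetime

FORMATS = ["%m/%d/%Y", "%b %d, %Y", "%m/%d/%y"]

def to_iso_date(text):
    """Return 'YYYY-MM-DD' if text parses as a known format, else None."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(text.strip(), fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return None
```

Rows where the function returns None go to a reject file for inspection; everything else is already normalized, so the load itself needs no conversion logic at all.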

>