Showing posts with label ive. Show all posts

Tuesday, March 20, 2012

Big problem connecting to SQLEXPRESS from a VB.NET application

Hello.
I have an application that does a bunch of SELECTs and INSERTs into a
couple of tables. On my machine everything works fine, but when I try to
deploy the app to a VM for a test I have many problems.
The most common is "(Named Pipes Provider, error: 40 - Could not open a
connection to SQL Server)", but using (local)\SQLEXPRESS I also get
"Request for the permission of type
'System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089'"... I think it's a
connection string problem. I used "Data Source=(local)\SQLEXPRESS;Initial
Catalog=Clienti;Integrated Security=True", where "Clienti" is the database,
but I also tried the string I found on
http://www.connectionstrings.com/?carrier=sqlserver2005.
It's absurd... please give me some help... thanks...
Ah, did you enable TCP/IP on the target server? Why are you using named
pipes? If you enable TCP/IP, you must also open the additional ports in the
firewall (if there is one).
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
Visit www.hitchhikerguides.net to get more information on my latest book:
Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition)
and Hitchhiker's Guide to SQL Server 2005 Compact Edition (EBook)
------
"THE JOK3R" <a@.a.a> wrote in message
news:%23GL$eDNjHHA.4520@.TK2MSFTNGP02.phx.gbl...
|||"William (Bill) Vaughn" <billvaRemoveThis@.betav.com> wrote in message
news:OpGW39NjHHA.1624@.TK2MSFTNGP06.phx.gbl...
I did all of that already... I enabled TCP/IP and remote connections, and
opened port 1431, the default port, in the firewall...
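A note on the above: SQL Server's default TCP port is 1433, not 1431, so the firewall rule itself may be the culprit. A connection string that forces the TCP/IP provider and an explicit port looks like this sketch (the host name "MyVmHost" is a placeholder for the VM's name or address):

```
Data Source=tcp:MyVmHost\SQLEXPRESS,1433;Initial Catalog=Clienti;Integrated Security=True
```

The "tcp:" prefix bypasses named pipes entirely, which makes it easier to tell a protocol problem from a firewall problem.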

Monday, March 19, 2012

Big Int to Date Conversion

Hi guys!!!
I really need your help now. I tried my level best to work around this
problem but couldn't.
I have a bigint column that contains "Ticks" (a long value), and I want to
convert it to DateTime, but I'm not able to do it. The value looks like
632557193645506250, and I want to convert it into datetime format.
Please help me out, guys. I'm using MS SQL Server 2000, and the front end
is VS .NET 2005.
I have to do it in a SQL query; please don't ask me the reason, but because
of some constraints I cannot do it in .NET.
PLEASE HELP ME, GUYS.
hi,
Lucky wrote:
The DATEDIFF() and DATEADD() built-in functions expect a parameter of the
"int" datatype... if you perform
SET NOCOUNT ON;
DECLARE @d datetime, @t bigint;
SET @t = 632557193645506250;
SELECT @d = DATEADD(ms, @t, '19000101');
SELECT @d AS [date as difference in ms from 01/01/1900];
which adds the specified number of ticks to a base date value (which you are
required to know in order to perform the calculation), you get
Server: Msg 8115, Level 16, State 2, Line 4
Arithmetic overflow error converting expression to data type int.
date as difference in ms from 01/01/1900
NULL
as the provided "difference" exceeds the max supported value...
http://msdn.microsoft.com/library/de...da-db_3vtw.asp
If you can "convert" that big value into separate values, "days" for the
date part and "ms" (depending on the actual precision you are required to
provide) for the time part, in order not to overflow the int datatype
(32767), you can easily use the DATEADD() function.
Andrea Montanari (Microsoft MVP - SQL Server)
http://www.asql.biz/DbaMgr.shtm - http://italy.mvps.org
DbaMgr2k ver 0.18.0 - DbaMgr ver 0.62.0
(my little VB6 + SQL-DMO attempt to give MS MSDE 1.0 and MSDE 2000 a visual
interface)
-- remove DMO to reply
|||On Tue, 4 Apr 2006 19:18:30 +0200, Andrea Montanari wrote:

>if you can "convert" that big value into separated values as "days" for the
>date part and "ms" (depending on the actual precision you are required to
>provide) for the time part in order not to overflow the int datatype
>(32767), you can easely use the DATEADD(..) function..
Hi Andrea,
Eh? 32767 ?
I think you meant to write 2,147,483,647 here...
To Lucky:
Can you tell me how long a "tick" lasts, and what the starting point of
the measurement is? And what is the exact date and time corresponding
to the 632557193645506250 value in your post?
Hugo Kornelis, SQL Server MVP
|||hi Hugo
Hugo Kornelis wrote:
> On Tue, 4 Apr 2006 19:18:30 +0200, Andrea Montanari wrote:
>
> Hi Andrea,
> Eh? 32767 ?
> I think you meant to write 2,147,483,647 here...
Oops... I've just been working with VB6 these days, with its smaller "int"
size... LOL. I apologise.
Thank you for pointing it out.
Andrea Montanari (Microsoft MVP - SQL Server)
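For anyone hitting this later: assuming the value is a .NET System.DateTime tick count (ticks are 100-nanosecond intervals counted from 0001-01-01 00:00:00), the arithmetic Andrea and Hugo discuss can be sanity-checked outside SQL. A minimal Python sketch:

```python
from datetime import datetime, timedelta

def ticks_to_datetime(ticks):
    """Convert .NET DateTime ticks (100 ns units since 0001-01-01) to a datetime."""
    # 10 ticks make one microsecond; timedelta absorbs the large offset directly
    return datetime(1, 1, 1) + timedelta(microseconds=ticks // 10)

print(ticks_to_datetime(632557193645506250))  # 2005-06-30 09:09:24.550625
```

In T-SQL the same split Andrea suggests applies: divide the ticks into whole days and leftover milliseconds so that neither DATEADD argument overflows the int range, then add both parts to the 0001-01-01 base (rebased to a date SQL Server 2000 can represent, e.g. 1900-01-01).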

Wednesday, March 7, 2012

BI Accelerator + Applications + Report Services

Hi All,
this seems like the best place for this question.
I've had cause to review the MSFT position/tools in the BI area. I'm
surprised!!! I feel like MSFT have made MUCH more progress than is generally
talked about in newsgroups and among the customers I work with. All I ever hear from
MSFT with respect to DW/BI is 'the next release is going to be great.' Not,
'Hey, take a look at BIA/Reporting Services'...?
Q1. In BI accelerator I installed and tested out the shopfloor application
(manufacturing) and I generated the shopfloor database from the XL Sheet.
But I can't see where it got the table definitions for the staging table and
the underlying dimensional database from. For example, where are the
table/column/datatype definitions of the staging area and the dimensional
database in the XL sheets? I'm sure I looked at all the sheets in the
spreadsheet AnalyticsBuilderWB_ShopFloorPerformance.v.0.2.xls. (But maybe
I'm going crazy and didn't). To me it only seems like the analytical model
is in the spreadsheet.
Q2. I see it generates a ton of .dts files. And when I open them up they
seem to me (a person with zero DTS skills) to be very complex. The manual
says 'don't change the packages' which I gather means don't look, don't
worry, just run it and it will all be ok....again, I gather that the DTS
packages are somehow generated by BIA into their binary format and I believe
the source to target mappings are defined by the 'mappings' spreadsheet. But
in this spreadsheet I don't seem to find enough columns for all the columns
in the dimensional model. For example I can't find mappings for dim_emp_std
in the spreadsheet but I would have thought it should be there. (Or am I
missing something)?
Q3. In BIA I see lots about sharepoint portal and office objects inside web
pages and all that as a presentation layer. But I just took a look at
Reporting Services. Reporting Services looks absolutely fantastic for what
it is trying to do, and it was very obvious how easy it will be to extend
reporting services to do MUCH, MUCH more very, very easily. RDL (Report
Definition Language) is an idea long overdue and I do believe MSFT are first
with that one. I have not heard any other vendor talk about an RDL yet. I
was amazed that a report can just be exposed as a web service and you can
call it from anywhere with anything. Now THAT is a useful thing to have.
So the question is, how come BIA seems to completely ignore Reporting
Services? I would have thought they would be very closely 'joined'. For
example, why not do the front ends for BIA apps in reporting services? I am
assuming this is possible. I'm assuming reporting services can get data out
of Analysis Server because it can get data out of any ado.net server.
(I must say I thought MSFT had hidden how good Reporting Services is very
well...unless I'm greatly mistaken, I've only had a few hours to look at it,
it looks like a really, really useful product!!!)
Anyway, thanks in advance if anyone can let me know what I am missing in
BIA.
Best Regards
Peter Nolan
www.peternolan.com
Wow. There is a lot of info here. A couple of comments:
Q1) datatypes used in BIA -- well, we make several guesses. First, we know
that all measures are numeric (that is a requirement of Analysis Services).
Second, we know that the fields we generate have specific uses, and from
that we know their datatypes. For example, the surrogate keys are integers
(as you would expect, because they are identity columns). Fields which are
user-supplied, e.g. member names, we just treat as "names", i.e. varchars.
If you have something different, e.g. names which are really integers, you
will have to re-do things by hand. For those things which could be any
datatype, e.g. member properties, you can choose the appropriate datatypes.
If there are specific tables and columns you are wondering about just tell
me and I'd be glad to explain why we did something one way or the other.
What you are seeing is the whole point of the BI Accelerator. You lay out the
logical multidimensional design and we auto-generate a relational staging
area, final data mart, OLAP structures, and DTS packages to move the data
through it. Logical data model to final app in one click :-)
And that final app should have all of the tips and tricks that you would see
in a production system; not just a rough proof-of-concept system.
Q2) DTS packages in BIA -- thank you. I wrote the generator code and I am
quite proud of them.
It generates a ton of packages because of the type of schema we chose to
implement. Since we generate a snowflake schema, there are tables for every
dimension and level in the dimensional structure.
We had several goals with the DTS packages. First, we wanted them to be
data-driven, because we don't expect everyone to be an expert on DTS. Thus,
rather than having to make changes to the packages themselves, most of them
have variables whose values you can change to make the package do something
different. All of this is documented in the PAG (online doc set). One of the
challenges was that DTS in SQL Server 2000 doesn't have the control-flow
tasks needed to make this declarative -- with 2005 we put that all in
natively, and similar packages in 2005 would be quite a bit less complex.
Second, we wanted the packages to be visible and extensible by
knowledgeable users. Nothing is hidden -- it is all up front and in your
face. Yes, a novice will look at them and despair -- but don't give up!
There is documentation in the PAG for all of them! Lastly, you might
be interested in this white paper, which talks about the DTS packages and
provides various tips and tricks beyond what the PAG has in it.
http://msdn.microsoft.com/library/de..._dts_ssabi.asp
If there is a specific step you have questions about, I'd be glad to help
also.
Q3) I am glad you like Reporting Services. I agree it is a fantastic tool.
However, it is a totally different product. BI Accelerator is a BI
application generator, not a report generator. It has a client component
only for customization. The idea was this: suppose you had a report called a
"template" which was exactly the same from client to client, but one
customer called products "items" and another called products "books".
Wouldn't it be nice to ship a template along with the multidimensional
design and have the system automatically rename products to items or
products to books? That is what the client generator does with the BI
Accelerator. It ships with a ProClarity component which knows how to go
inside a ProClarity briefing book and replace one tag with another. A
similar facility is available from Panorama, and there is an API available
to other vendors if they would like to plug into the BI Accelerator client
generator.
Hope this helps.
Dave Wickert [MSFT]
dwickert@.online.microsoft.com
Program Manager
BI Systems Team
SQL BI Product Unit (Analysis Services)
This posting is provided "AS IS" with no warranties, and confers no rights.
"Peter Nolan" <peter@.peternolan.com> wrote in message
news:uCEWHDlBFHA.2876@.TK2MSFTNGP12.phx.gbl...
|||Hi Dave,
wow, nice to see the guy who 'wrote the code' is watching the forum.
Thanks for your feedback. Makes sense and I can see where BIA could
head in the next release.
You'll be pleased to hear this...the reason for my interest is some
colleagues and I are bringing to market a product that will be based
solely on MSFT BI technologies and part of that is we are researching
everything we can find about MSFT BI and the way forward with MSFT
BI..... I haven't seen any kind of comprehensive suite of slides that
is public on this matter... if you happen to know where one can be
downloaded from, that would be great.
I still have to figure out whether BIA can fit into what we are doing. We
shall see.
My other comment...
"Q2) DTS packages in BIA -- thank you. I wrote the generator code and
I am quite proud of them."
Let's just say I recognise a good idea and a smart guy when I see
one...;-)
One of my specialities is ETL and reducing the cost of writing ETL for
my clients. I've been doing DW for 14 years now, and ETL consumes so
much of the budget that it's the obvious cost to keep working to reduce.
About 2.5 years ago I investigated the possibility of writing a
generator for ETL jobs in Informatica and DataStage. I found it was
certainly possible to do so. The 'problem' was that it would be
impossible for an 'independent' to make any money out of it. If the
idea did well it would be trivial for the vendor to write the same
functionality and release it. My generator was planned to produce XML
as both INFA/DS can export/import jobs in XML format.
So I was impressed that you have come up with the same idea and from
looking at the time frames you must have had that idea before me...well
done..;-)...And that you are able to produce the DTS jobs in binary.
(Though I'm sure being MSFT you can see the source code/structures for
DTS.)
Me, I changed course after my evaluation and wrote my own ETL tool. It
is very productive. Rather than mapping fields at the field level, it maps
at the table level and moves fields between tables by matching column names.
So it is never actually necessary to define source-to-target mappings
anywhere. It discovers the column names at run time, so when more columns
are added there is no code to regenerate or change, and it is the changing
of ETL code that is expensive in DW maintenance.
This is another reason why I was impressed with what you have
done....yours is the first example I have seen where it might be
possible to get away with ETL changes without re-testing in the
non-production environment while still using a 'real' ETL tool.
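The table-level, match-by-name mapping idea described above can be sketched in a few lines; this is a hypothetical illustration of the technique, not Peter's actual tool (the function name and data shapes are invented):

```python
def copy_matching_columns(source_rows, target_columns):
    """Copy rows, keeping only the fields whose names exist in the target table.

    Columns are matched by name at run time, so adding a column to both the
    source and the target requires no mapping change or regenerated code.
    """
    target = set(target_columns)
    return [{col: value for col, value in row.items() if col in target}
            for row in source_rows]

rows = [{"id": 1, "name": "widget", "legacy_flag": "x"}]
print(copy_matching_columns(rows, ["id", "name", "created_at"]))
# [{'id': 1, 'name': 'widget'}]
```

Extra source columns are silently dropped and missing target columns are left to the database defaults, which is what makes the approach resilient to schema drift.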
We have also settled on a spreadsheet as the way to record all
mappings. But we had not automated the generation of jobs/code because
even on very large projects the amount of time taken is very small. (I
recently built the ETL for a staging area+DW with 100 tables and 3,000
fields in 2 weeks so there didn't seem to be a need to speed it up even
further.) So your idea of putting a 'generate' button in the
spreadsheet and a whole set of ticks on what to generate out of the
spreadsheet also intrigued me. It looks like a really 'neat trick'. I
am an 'XL-dummy' so that thought had not crossed my mind before. I must
ask one of my colleagues how that was done......The thought has
crossed my mind that our spreadsheet could be extended to record other
information required for ETL generation and we could cut even the 2
weeks work we do now out by generating what we need to generate...
So, like I said, I just recognised a good idea and a smart guy...;-)
All the best...I'll be investigating BIA some more and look forward to
seeing what happens next...
Best Regards
Peter Nolan
www.peternolan.com
|||Actually the DTS generator uses the normal SQL Server APIs to create its
packages. All I did was create the packages that I wanted, then save each
package as VB. Then I reverse-engineered the VB back into production-quality
code. We never had to look at the source code for DTS at all.
Dave Wickert [MSFT]
"Peter Nolan" <peter@.peternolan.com> wrote in message
news:1107346392.279399.284200@.f14g2000cwb.googlegr oups.com...
|||Hi Dave,
interesting... I am not up on DTS and I had never heard that the
packages could be saved to VB... I must look into DTS when the next
release comes out.....
I've asked one of my Excel-knowledgeable guys how to put something like a
'generate' button into the spreadsheets we use to write our mappings,
and to call some C++ code we have...... In fact, we type the
definitions into a spreadsheet and then drop them into the database to
run the code that generates the tables/views!!! (LOL) This is when we
aren't using a data modelling tool. Most places have a standard tool
that must be used for table definitions...
Best Regards
Peter Nolan

Friday, February 24, 2012

Beta Data Conversion

I have a data file from the Beta 2 version, but I've already uninstalled the
beta and installed the release version of SQL Server 2005. Does anybody
know how to convert the data file to be usable in SQL Server 2005?
Jeffry
Are you talking about an .MDF data file? If you are, have you tried to
attach the file? There are two system stored procedures for this,
sp_detach_db and sp_attach_db.
"Jeffrey Kam" <Jeffrey Kam@.discussions.microsoft.com> wrote in message
news:913ABCEF-F02C-44A2-B9C0-881A7452D331@.microsoft.com...
>I have a data file in Beta 2 version, but now I've already uninstalled the
> beta version and installed a new Release Version of SQLServer 2005. Does
> anybody know how to convert the data file to be usable in SQLServer 2005?
|||I tried to attach it directly, but it said the beta version is not supported.
"Uri Dimant" wrote:

|||Yep, I see. The RESTORE command probably will not work either.
Do you still have the Beta CD, so you could install it and transfer the data?
Unfortunately, if the above does not work, I have no idea how to restore the
data.
"Jeffrey Kam" <JeffreyKam@.discussions.microsoft.com> wrote in message
news:EE6DD8D9-FE2E-4566-903F-05F3748C2E7A@.microsoft.com...
>i tried to attach directly but it said it doesn't support beta version
> "Uri Dimant" wrote:
|||Thank you for your reply.
Sorry, I still couldn't find it. That's why I want a solution that doesn't
require setting up another Beta 2 instance. Does Microsoft provide such a tool?
"Uri Dimant" wrote:

|||Hi,
I am not aware of any such tool, sorry.
"Jeffrey Kam" <JeffreyKam@.discussions.microsoft.com> wrote in message
news:BF911B6B-A4F6-4740-8DBA-46C09EB9BF20@.microsoft.com...

Thursday, February 16, 2012

best way to preserve all dependencies

I know I've asked this before, and I created separate publications for
tables, views, etc. in order to avoid conflicts such as a view being created
before its table.
What do you guys prefer as the best method to handle this, especially when
reinitializing a publication that contains tables and it results in an error
because the table's article properties specify a drop, and the table cannot
be dropped while a view on it is present?
There must be better ways to handle all of this. If only I could create one
publication for all objects belonging to one database... if only all of the
dependencies were ordered.
Hassan,
like you, I have separate publications - one for tables, one for views and
one for UDFs.
AFAIK, dropping a table should be fine unless the view has been created
using schemabinding.
For those cases where the dependency info of views is incorrect,
sp_refreshview is sometimes useful; sometimes I use sp_addscriptexec; and
sometimes I just add the view manually on the subscriber, when there is
connectivity and there is only one subscriber.
HTH,
Paul Ibison SQL Server MVP, www.replicationanswers.com
(recommended sql server 2000 replication book:
http://www.nwsu.com/0974973602p.html)