
Tuesday, March 27, 2012

Bind SQL reporting services (Reports) to ASP.NET (Urgent)

Hi all,

Currently I have a project that requires me to use Reporting Services (Reports) together with an ASP.NET web application.

However, I have the following problem.

I have an ASP.NET web application. There, I have a textbox where the user can type a specific date. That date should then be passed to SQL Reporting Services, and the report that is generated should contain only the data for that specific date.

How can I do that?

I tried to search the internet, but to no avail. Can anyone help out?

Thank you so much.
Read a book on Reporting Services ... you can pass everything to Reporting Services through the web services it exposes ...
Microsoft has some good books about Reporting Services ... Search for William Vaughn ... He's very good.
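As a concrete sketch of the reply above: besides the web-service API, Reporting Services also accepts parameter values directly on the report URL ("URL access"), which is often the simplest way to pass a date typed into an ASP.NET textbox. The server name, report path, and StartDate parameter below are hypothetical placeholders; this Python snippet only illustrates the shape of the URL your ASP.NET page would build (e.g. for a redirect or an IFRAME src).

```python
# Build a Reporting Services URL-access request. "rs:Command=Render"
# tells the report server to render the report; report parameters are
# appended as ordinary name=value pairs. All names here are placeholders.
from urllib.parse import quote

def build_report_url(server, report_path, params):
    query = "&".join(f"{quote(name)}={quote(value)}"
                     for name, value in params.items())
    return f"http://{server}/ReportServer?{quote(report_path)}&rs:Command=Render&{query}"

url = build_report_url("myserver", "/Sales/DailyReport",
                       {"StartDate": "2012-03-27"})
print(url)
# http://myserver/ReportServer?/Sales/DailyReport&rs:Command=Render&StartDate=2012-03-27
```

Adding rs:Format=PDF (or EXCEL, etc.) to the query string controls the rendering format.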

Bind my own DataSet that I get in my WebApp and show in the Reporting Services - is it possible

All,
Does anybody know if I can bind my own DataSet dynamically getting it in my
WebApp and show it in the Reporting Services using the services as a
formatter/viewer of this DataSet - is it possible?
Or all I can do with the RS 2000 is just a new VD for each new report and I
have to use this URL to show anything from my app? I didn't see anything
helpful allowing us to integrate the Reporting Services into WebApp yet on
the fly. What's the difference if the RS generates the DataSet or it's
generated by my application? The second way is much more flexible and
practical. Is it already implemented in the RS 2005? I saw the controls, but
didn't test them yet since my solution is not convertible to the VS2005 with
the new Reporting Services.
Just D.
VS 2005 has two new controls, WebForm and WinForm. You can use them in
local mode and give them the dataset. In local mode there is more you
have to do yourself (subreports etc.).
In 2000 you would have to create a data processing extension (which would
work for either 2000 or 2005).
RS is designed as a service oriented application. You request a service and
RS implements it. You can call stored procedures easily or use SQL
statement. Pass in the parameters from your app and let RS create the
dataset.
Any particular reason you want to be passing a dataset? If you let RS handle
it then it will automatically do a lot for you. For instance, take the issue
of subreports. If you are handling the dataset then for each subreport you
need to respond to an event and generate the datasets for the subreport as
well.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Just D." <no@.spam.please> wrote in message
news:Kk5If.13830$eR.455@.fed1read03...
> All,
> Does anybody know if I can bind my own DataSet dynamically getting it in
> my WebApp and show it in the Reporting Services using the services as a
> formatter/viewer of this DataSet - is it possible?
> Or all I can do with the RS 2000 is just a new VD for each new report and
> I have to use this URL to show anything from my app? I didn't see anything
> helpful allowing us to integrate the Reporting Services into WebApp yet on
> the fly. What's the difference if the RS generates the DataSet or it's
> generated by my application? The second way is much more flexible and
> practical. Is it already implemented in the RS 2005? I saw the controls,
> but didn't test them yet since my solution is not convertible to the
> VS2005 with the new Reporting Services.
> Just D.
>|||I've got a case where I need to bind to a dataset that I fetch. In my
case, I need to get a stored definition of a user-customized query, and
generate RDL from that definition. I then need to fill a weakly-typed
dataset to be used as the data source for the generated report.
Any reference on how to use pre-filled datasets in SSRS? I've been
searching Google and MSDN, and have so far come up short.
Thanks in advance.
|||
If you are creating your own RDL then the way to go is to definitely get the
new controls with VS 2005. In local mode you don't need any server. You can
give the report the rdlc and the dataset.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Kevin L." <kevin.lowe0@.gmail.com> wrote in message
news:1139862643.634896.4400@.o13g2000cwo.googlegroups.com...
> I've got a case where I need to bind to a dataset that I fetch. In my
> case, I need to get a stored definition of a user-customized query, and
> generate RDL from that definition. I then need to fill a weakly-typed
> dataset to be used as the data source for the generated report.
> Any reference on how to use pre-filled datasets in SRSS? I've been
> searching Google and MSDN, and have so far come up short.
> Thanks in advance.
>
> Bruce L-C [MVP] wrote:
>> VS 2005 have two new controls. Webform and winform. You can use them in
>> local mode and give them the dataset. In local mode you do have more you
>> have to do (subforms etc).
>> In 2000 you would have to create a data processing extension (which would
>> work for either 2000 or 2005).
>> RS is designed as a service oriented application. You request a service
>> and
>> RS implements it. You can call stored procedures easily or use SQL
>> statement. Pass in the parameters from your app and let RS create the
>> dataset.
>> Any particular reason you want to be passing a dataset. If you let RS
>> handle
>> it then it will automatically do a lot for you. For instance, take the
>> issue
>> of subreports. If you are handling the dataset then for each subreport
>> you
>> need to respond to an event and generate the datasets for the subreport
>> as
>> well.
>>
>> --
>> Bruce Loehle-Conger
>> MVP SQL Server Reporting Services
>> "Just D." <no@.spam.please> wrote in message
>> news:Kk5If.13830$eR.455@.fed1read03...
>> > All,
>> >
>> > Does anybody know if I can bind my own DataSet dynamically getting it
>> > in
>> > my WebApp and show it in the Reporting Services using the services as a
>> > formatter/viewer of this DataSet - is it possible?
>> >
>> > Or all I can do with the RS 2000 is just a new VD for each new report
>> > and
>> > I have to use this URL to show anything from my app? I didn't see
>> > anything
>> > helpful allowing us to integrate the Reporting Services into WebApp yet
>> > on
>> > the fly. What's the difference if the RS generates the DataSet or it's
>> > generated by my application? The second way is much more flexible and
>> > practical. Is it already implemented in the RS 2005? I saw the
>> > controls,
>> > but didn't test them yet since my solution is not convertible to the
>> > VS2005 with the new Reporting Services.
>> >
>> > Just D.
>> >
>> >
>

Thursday, March 22, 2012

Big-IP and SSL

Hi! I am working on some issues related to Reporting Services and SSL, and I
wanted to know:
- If I install an NLB cluster and I add "n" servers, how many SSL
certificates will I need? 1 or "n"?
- Does anyone know if there are any issues related to MRS, SSL and the
Big-IP Load Balancer?
Thank you very much in advance :-)
Jose
--
------
Jose Ignacio Rodas, A+, CCEA, MCSE
I use BigIP's in production, LVS' in test and development. If I understand
your first question about NLB clusters, you will need one certificate per
domain name. If your NLB cluster, comprised of "n" servers, answers up to
www.mydomain.com then you will need a certificate on www.mydomain.com. I
worked with NLB in 2000 and do not remember certificate management as being a
part of it, I could be wrong, not sure about 2003. As far as the BigIP and
SQL RS, good luck. It will work, if configured properly. Supposedly SP2 of
SQL RS added a feature to support SSL termination prior to the web server if
the proper HTTP headers are passed. I was able to get Reports working, but
not ReportServer. The information is available in the SP2 update
documentation. Hope this helps.
"Jose Ignacio Rodas" wrote:
> Hi! I am working on some issues related to Reporting Services and SSL, I
> wanted to know:
> - If I install a NLB cluster and I add "n" servers, how many SSL
> Certificates will I need? 1 or "n"?
> - does anyone know if there are any issue related to MRS, SSL and the
> Big-IP Load Balancer?
> Thank you very much in advance :-)
> Jose
> --
>
> ------
> Jose Ignacio Rodas, A+, CCEA, MCSE|||http://download.microsoft.com/download/5/1/3/513534ae-a0e7-44e6-9a04-ba3c549a5f5f/sp2Readme_EN.htm#_http_headers
That is the SP2 readme that talks about SSL termination
"Brian" wrote:
> I use BigIP's in production, LVS' in test and development. If I understand
> your first question about NLB clusters, you will need one certificate per
> domain name. If your NLB cluster, comprised of "n" servers, answers up to
> www.mydomain.com then you will need a certificate on www.mydomain.com. I
> worked with NLB in 2000 and do not remember certificate management as being a
> part of it, I could be wrong, not sure about 2003. As far as the BigIP and
> SQL RS, good luck. It will work, if configured properly. Supposedly SP2 of
> SQL RS added a feature to support SSL termination prior to the web server if
> the proper HTTP headers are passed. I was able to get Reports working, but
> not ReportServer. The information is available in the SP2 update
> documentation. Hope this helps.
> "Jose Ignacio Rodas" wrote:
> > Hi! I am working on some issues related to Reporting Services and SSL, I
> > wanted to know:
> >
> > - If I install a NLB cluster and I add "n" servers, how many SSL
> > Certificates will I need? 1 or "n"?
> >
> > - does anyone know if there are any issue related to MRS, SSL and the
> > Big-IP Load Balancer?
> >
> > Thank you very much in advance :-)
> >
> > Jose
> > --
> >
> >
> >
> > ------
> > Jose Ignacio Rodas, A+, CCEA, MCSE

Tuesday, March 20, 2012

Big picture question about Reporting Services

I am a beginner who has completed one report under RS. I'm wondering if my
memory is playing tricks on me. This report has been rather easy to do.
I've hardly had to look at the documentation. (I should point out that I'm
temporarily using the SQL Express version because someone misplaced the disks
for the full version.)
My question is this: I was under the impression that Reporting Services
involved actual Dot Net programming. At least that was the impression I got
from the early demos I saw, but it has been a while. Is there more to RS
than just dragging fields on a designer?
On May 15, 8:51 am, B. Chernick <BChern...@.discussions.microsoft.com> wrote:
There does not have to be any actual programming, even though you are
in Visual Studio. Demos or classes may have examples where you could
code an assembly and then call it from the report.
|||
You can write expressions in Reporting Services using Visual Basic.
No .NET programming is required to create reports.
However, because the report definition files (RDL) are basically XML files,
the system is wide open for customisation.
For example, I have a SQL stored procedure that generates hundreds of
reports automatically for me.
Now that is real power.
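Since RDL is plain XML, "generating hundreds of reports" boils down to emitting RDL documents from metadata. The sketch below builds a bare-bones fragment with Python's standard library; the element subset is illustrative only (a real report also needs a data source, a query, a body and report items), and the namespace shown is the one used by SQL Server 2005 RDL, so adjust for your version.

```python
# Minimal illustration of generating an RDL-style XML skeleton from a
# list of field names. Not a complete, deployable report definition.
import xml.etree.ElementTree as ET

RDL_NS = "http://schemas.microsoft.com/sqlserver/reporting/2005/01/reportdefinition"

def make_report(fields):
    ET.register_namespace("", RDL_NS)          # emit a default xmlns
    report = ET.Element(f"{{{RDL_NS}}}Report")
    datasets = ET.SubElement(report, f"{{{RDL_NS}}}DataSets")
    dataset = ET.SubElement(datasets, f"{{{RDL_NS}}}DataSet", Name="Main")
    fields_el = ET.SubElement(dataset, f"{{{RDL_NS}}}Fields")
    for name in fields:
        field = ET.SubElement(fields_el, f"{{{RDL_NS}}}Field", Name=name)
        ET.SubElement(field, f"{{{RDL_NS}}}DataField").text = name
    return ET.tostring(report, encoding="unicode")

rdl = make_report(["OrderDate", "Amount"])
print(rdl)
```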

Big performance problem with reports

Hello everyone!
I have major performance problems with reports that I have deployed in
Reporting Services.
I have a SQL database with millions of rows, and I have created a cube in
Analysis Services that is run against a view in the database.
My reports get data from the cube, and the dataset is created with SQL syntax.
I need to filter the report and needed to create parameters, but only
parameters that give the option of choosing "all", i.e. optional
parameters. The only way I can do that, as far as I know, is by creating the
filter parameter datasets in SQL syntax. That is to be able to use a union, as
in
"select 'All'
union
select the real dataset
group by value" syntax. And it seems that I can't use union when running the
sql statement on a cube, so I have to run the statement on the original
database view.
Now the problem I have is that it takes about half an hour or more to view
the report, after choosing values in some filter parameters!
As far as I can see, the group by, is the statement that takes such a long
time. But that can't be the only problem.
So... does anyone know what I can do to reduce the time it takes to process
a report and what performance enhancements I can do?
I also have a problem with a specific report that gets a
System.OutOfMemoryException when I try to view it in the preview pane. But
when I run the dataset generating the report data, there are no problems.
Does anybody know what the problem might be?
I'm in a tight place and would appreciate any fast responses.
Thanks a lot,
Tara
Do you have a stored procedure or is the SQL code inside the RDL?
If you put as much as possible on the SQL Server, you can run it
through Query Analyzer and look at query plans etc. It does sound like
you are passing a lot more data around than you need, so get all the
SQL code out of the RDL and see where that gets you.
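One more angle on the optional-"All" parameter: it can also be handled on the application side before the query ever reaches the view, by only adding the filter clause when the user picked a real value. A hedged sketch (table and column names are invented):

```python
# Build the SQL and parameter list for an optional "All" filter.
# "?" is the ODBC/OLE DB positional parameter marker; names are placeholders.
def build_query(region):
    base = "SELECT Region, SUM(Sales) AS Sales FROM SalesView"
    if region != "All":
        return base + " WHERE Region = ? GROUP BY Region", [region]
    return base + " GROUP BY Region", []

print(build_query("All"))
print(build_query("West"))
```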

Sunday, March 11, 2012

BI Studio shuts automatically

I am having a weird trouble with BI Studio. Whenever I start an Integration Services project, BI Studio just closes automatically. Any idea what's going on here? Is this a bug? I am already on SP2.

Hello,

Are you sure you have the necessary SSIS components installed? Has it been working on that machine at all?

Thanks.

BI Portal connected to analysis services 2005

Has anyone connected the BI Portal with Analysis Services 2005?

I have a security problem.

Thanks

Are you trying to troubleshoot connectivity to Analysis Services?

Is this HTTP or TCP connection?

See if this troubleshooting guide is going to help you: http://www.sqljunkies.com/WebLog/edwardm/archive/2006/05/26/21447.aspx

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

I had to do this a couple of days ago.

The only way I could get it to work was if I went into the advanced options on the Data Source and specified the full connection string myself, making sure to specify MSOLAP.3 as the provider. When I just chose the options from the standard connection dialog, I could not get it to work; however, I was running the BI Portal for a demonstration from a virtual server, so there could have been other complicating factors involved.

Note: because BI Portal uses OWC you will need to have the OLEDB provider for AS 2005 installed on all your client machines.

Hope this helps.

|||

I did what you said; I connect to AS2005, and when I test the connection all works fine. I make the datasource, but the web components don't show the pivot table... it gives me an error and shows nothing...

I don't know what I can do... (I have OWC11 installed)

|||

Well, if the datasource tests OK then that is one hurdle we are over.

What is the exact error you are getting?

It may be something to do with the security in IE. As you may know OWC is an ActiveX control that runs on the browser and it needs to have higher than normal security rights. When you are connected to the BI Portal do you see "Trusted Zone" down in the bottom right hand corner? If so you might want to double check the rights for the Trusted zone and make sure that none of the ActiveX related permissions are disabled.

|||

When I am creating the datasource I get this error:

The object doesn't accept this property or method.

The connection string i get is:

"Provider=MSOLAP.3;Cache Authentication=False;Persist Security Info=False;User ID=administrador;Initial Catalog=OnAlert2;Data Source=w2k3sql2005;Impersonation Level=Impersonate;Location=w2k3sql2005;Mode=ReadWrite;Protection Level=Pkt Privacy;Auto Synch Period=20000;Default Isolation Mode=0;Default MDX Visual Mode=0;MDX Compatibility=0;MDX Unique Name Style=0;Non Empty Threshold=0;SQLQueryMode=Calculated;Safety Options=1;Secured Cell Value=0;SQL Compatibility=0;Compression Level=0;Real Time Olap=False;Packet Size=4096;Initial Catalog=OnAlert2"

I am creating the connection using Analysis Services 9.0.

The pivot table doesn't load the cube; I get the error. The pivot table says:

There are no details; the data provider doesn't provide more information.

Then I click on More Information: Error = 0x80004005

|||
I open Excel to test, and I can connect to the cube... so I don't have connection problems; the problem is with OWC.
|||

I'm a bit confused, I thought you said earlier that you created and tested the connection OK, now you say that you are getting an error when you create the data source.

Why don't we try just setting the absolute minimum settings in the connection string and see if we can get a connection created without an error. Try a connection string like the following:

Provider=MSOLAP.3;Initial Catalog=OnAlert2;Data Source=w2k3sql2005;

|||

OK, how do I set this connection string? Right now I am clicking on Advanced, then on Create; I put in the name of the server etc., and it generates the connection string - I can't modify it...

|||

Sorry, I don't know. I cannot enter a connection string directly into the advanced box, but if I connect first, or if I click Advanced and then browse and select a .oqy file, I can then edit the connection string.

I have not used BI Portal extensively, maybe you have an issue with your installation.

The main reason I suggested that you try a simpler connection string was that the sample you provided had the Initial Catalog setting listed twice, and you had a User Id and no Password set, even though User Id only works with HTTP connections.

You could try creating an OWC page outside of BI Portal just to test that the OWC components are working. The easiest way to do this is to set up a pivot table in Excel and then choose File -> Save As Web Page and click on the add interactivity option.

This will set up a stand-alone html page with an embedded pivot table. If it is running from a local drive, it will be running in the My Computer zone, so it will probably have a different security profile than when you are running from the BI Portal, but it should at least show us if the OWC components are working.
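As an aside, problems like the doubled Initial Catalog in the posted connection string are easy to catch mechanically by splitting the string into key/value pairs and flagging repeats. This is generic string handling, not an official OLE DB parser:

```python
# Split an OLE DB-style connection string on ";" and report keys that
# appear more than once (comparison is case-insensitive).
def parse_conn_string(s):
    seen, dupes = {}, []
    for part in s.split(";"):
        if not part.strip():
            continue
        key, _, value = part.partition("=")
        key = key.strip()
        if key.lower() in seen:
            dupes.append(key)
        seen[key.lower()] = value.strip()
    return seen, dupes

conn = ("Provider=MSOLAP.3;Initial Catalog=OnAlert2;"
        "Data Source=w2k3sql2005;Initial Catalog=OnAlert2")
settings, dupes = parse_conn_string(conn)
print(dupes)
# ['Initial Catalog']
```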

|||

I did what you said; I saved the Excel file as a web page, and OWC gives me an error:

"Can't process consult"

"The following system error occurred: . "

I have the latest OWC...

|||

Those error messages don't really look like anything I have come across before.

If you can connect via Excel, that means your OLEDB provider is installed and working correctly.

If we cannot set up OWC in a simple html page on the local PC (I'm assuming that you saved the file to a local drive; files running from network drives may have reduced privileges) then there is little chance of it working from BI Portal.

Are you using at least IE 5?

In Internet Explorer, under Tools - Internet Options, there is an Advanced tab, and under the Security section there is a setting for "Enable Integrated Windows Authentication" - is this option ticked?

|||

I don't know what to do... perhaps I have to enable HTTP access?

Does OWC connect to AS2005 via HTTP?

I really don't understand what is happening; I can connect to the cube via Excel, but when I am using OWC I have errors...

|||
I have Internet Explorer 6, and I have the option you mention enabled...
|||

I don't think HTTP access will change much.

The fact that Excel works proves that you can authenticate to the AS server using your Windows credentials.

Thursday, March 8, 2012

BI Development Studio: Crashes when viewing a Cube in the Browser

Hello, I created an Analysis Services project, defined the views and the cube itself. After I deployed the cube, I wanted to compare the results with the browser.

Every time I try to open the browser, BIDS starts loading, but after a while it crashes.

Is this a known error?

Thanks in advance!

Are you running Office 2007 on the same machine as BIDS? Is it a beta version of Office 2007?

You have not told us about the service pack level for SQL Server 2005, or whether you run a version of Office 2003 or later on the same machine.

Try downloading SP2 CTP3 (December 2006) from www.microsoft.com/sql and see if it helps.

Regards

Thomas Ivarsson

|||

Yes, I do run Office 2003 on my machine; btw, it is an Acer 5102 notebook. I installed SP2 CTP3, but the error still occurs.

Any other suggestions? It is really urgent.


Thanks in advance!

|||

I hope by the word "crash" you do not mean that you get an hourglass.

1. Do you run any Firewall? I have seen some issues with Integrity where one would not be able to browse the cube.

2. Did you try browsing the cube from any other application instead of VS, like Excel?

3. Also, are you trying to go to the Browse tab after the cube is completely deployed? I have seen instances where, if you try to switch tabs while the cube is being deployed, VS crashes (shuts down completely and launches again).

See if it is related to any of the above behaviors.

|||

I have never had any problem with BIDS and Office2003 since SQL Server 2005 RTM.

Try to install this add in for Excel 2003 and connect to the cube: http://www.microsoft.com/downloads/details.aspx?FamilyId=DAE82128-9F21-475D-88A4-4B6E6C069FF0&displaylang=en

Have you run Windows Update, and especially Office Update, lately?

Regards

Thomas Ivarsson

BI Development Studio not showing "create new project" on file menu

I just installed Reporting Services 2005 but when I start up SQL
Server Business Intelligence Development Studio I'm stuck. I don't
even have "create new project" as an option on my file menu, instead I
have "create team project" or "create file".
I previously had Visual Studio with Team Foundation Server on my
machine, but I only used it to log bugs. When I installed SS2005 and
reporting services I actually uninstalled the copy of Visual Studio I
had on my machine but that hasn't helped.
I don't want to create a team project (and I don't have permissions
anyway), I just want to use Reporting Services but it seems stuck in
some TFS mode. I'd take out the TFS part if I could figure out how
to.
Any ideas? Thanks in advance.
Try to uninstall TFS and reinstall Visual Studio 2005.
Amarnath
"doug.andersen@.gmail.com" wrote:
> I just installed Reporting Services 2005 but when I start up SQL
> Server Business Intelligence Development Studio I'm stuck. I don't
> even have "create new project" as an option on my file menu, instead I
> have "create team project" or "create file".
> I previously had Visual Studio with Team Foundation Server on my
> machine, but I only used it to log bugs. When I installed SS2005 and
> reporting services I actually uninstalled the copy of Visual Studio I
> had on my machine but that hasn't helped.
> I don't want to create a teams project (and I don't haver permissions
> anyway), I just want to use reporting services but it seems stuck in
> some TFS mode. I'd take out the TFS part if I could figure out how
> to.
> Any ideas? Thanks in advance.
>

BI and Reporting Tools used with Reporting Services

What tools (companies) have you found to work well with your Data Warehouse,
easy to use by non-developers and also interface with MS Reporting Services?
JBH1
> What tools (companies) have you found to work well with your Data
> Warehouse,
> easy to use by non-developers and also interface with MS Reporting
> Services?
*sigh* I have a thread wondering this same thing. I can't find anything. And
getting information from the actual companies is like pulling teeth.
|||> What tools (companies) have you found to work well with your Data
> Warehouse,
> easy to use by non-developers and also interface with MS Reporting
> Services?
Oh, and btw, a guy sent me a link to a site where: if you're willing to pay,
he'll let you read his reviews of various data warehouse access tools.
That's how much of a secret this stuff is.
|||I have some links related to reporting services and 3rd party companies on
Reporting Services.
You might want to have a look at the SQL 2005 Report Builder as well.
Cizer.net has some nice tools too.
Links are provided on my website .
www.dandyman.net/sql
Dandy Weyn
[MCSE-MCSA-MCDBA-MCDST-MCT]
http://www.dandyman.net
Check my SQL Server Resource Pages at http://www.dandyman.net/sql
"Jeff H." <JeffH@.discussions.microsoft.com> wrote in message
news:E5BFF01A-13CF-494A-9D90-AED28D7EC64B@.microsoft.com...
> What tools (companies) have you found to work well with your Data
> Warehouse,
> easy to use by non-developers and also interface with MS Reporting
> Services?
> --
> JBH1
|||This is sort of a broad question.
How is the data being accessed by the end users stored?
"Jeff H." <JeffH@.discussions.microsoft.com> wrote in message
news:E5BFF01A-13CF-494A-9D90-AED28D7EC64B@.microsoft.com...
> What tools (companies) have you found to work well with your Data
> Warehouse,
> easy to use by non-developers and also interface with MS Reporting
> Services?
> --
> JBH1
|||Ian,
I am not sure many of the major BI vendors are keen to work closely
with RS...after all, MSFT has now announced its intention of 'eating
their lunch'....remember how low-key the first release of RS was? And
now MSFT is 'talking it up' as a real alternative to Crystal Reports, now
owned by Business Objects.
I am certainly aware that Business Objects is training up staff to
compete with RS2005.
When an 800 pound gorilla like MSFT steps into the BI marketplace all
vendors are going to do whatever they can to maintain their
clients....
All those vendors (BO, COGN, MSTR etc) will all work with SQL Server
and position their tools as somehow 'better' than Report Services....
For the last 7-8 years people have talked about integration of the
development environment between vendors, usually via some form of
metadata hub (for example, Ascential's MetaStage and the Meta Data
Coalition etc.). However, this has been a pretty fruitless exercise.
The BI vendors are not interested in making it easier to share their
metadata with other BI vendors. Whoever owns 'what is presented on the
screen' owns the client and all of them are trying to protect their
position from MSFT's RS efforts.
Unfortunately, IT shops and business users have remained stubbornly
ignorant of the fact that 80% of the work is in database design and ETL
and 20% in the presentation layer...and people buy the presentation
layer and not the infrastructure/architecture. This makes 'what is on
the screen' even more important to protect.
Best Regards
Peter Nolan
www.peternolan.com
|||What do you mean by interfacing with Reporting Services? Microsoft itself
provides some good BI front-end tools for 2005.
"Jeff H." <JeffH@.discussions.microsoft.com> wrote in message
news:E5BFF01A-13CF-494A-9D90-AED28D7EC64B@.microsoft.com...
> What tools (companies) have you found to work well with your Data
> Warehouse,
> easy to use by non-developers and also interface with MS Reporting
> Services?
> --
> JBH1
In a data warehouse: in fact tables, surrounded by dimension tables.
"Jesse O" <jesperzz@.hotmail.com> wrote in message
news:ee5Z7SYuFHA.596@.TK2MSFTNGP12.phx.gbl...
> This is sort of a broad question.
> How isl the data being accessed by the end users stored?
>
>
> "Jeff H." <JeffH@.discussions.microsoft.com> wrote in message
> news:E5BFF01A-13CF-494A-9D90-AED28D7EC64B@.microsoft.com...
>
|||Hi JT,
which ones? I have read zip about any good tools that integrate RS into
them...we are doing this work for ourselves...
We have been surprised there is no good browser front end for RS and
that the answer we have gotten back so far is that we must build one
for ourselves...?
Thanks
Peter Nolan
www.peternolan.com
|||Depending on what you are looking for. Cizer has an interesting
web-based report builder environment. My favorite though is
SoftArtisans OfficeWriter -
http://officewriter.softartisans.com...riter-250.aspx
Because RS has a pretty complete array of rendering mechanisms, there
hasn't been a real need (at least on the dozen or so projects I have
worked on) for a 3rd-party add-on.
If you want to be a bit more specific about the problems you are
having that you are looking for a solution to, I'll see if I can find
something.
Steve Muise
neudesic LLC


Wednesday, March 7, 2012

BI Accelerator + Applications + Report Services

Hi All,
this seems like the best place for this question.
I've had cause to review the MSFT position/tools in the BI area. I'm
surprised!!! I feel like MSFT have made MUCH more progress than generally
talked about in newsgroups and customers I work with. All I ever hear from
MSFT with respect to DW/BI is 'the next release is going to be great.' Not,
'Hey, take a look at BIA/Reporting Services'...?
Q1. In BI accelerator I installed and tested out the shopfloor application
(manufacturing) and I generated the shopfloor database from the XL Sheet.
But I can't see where it got the table definitions for the staging table and
the underlying dimensional database from. For example, where are the
table/column/datatype definitions of the staging area and the dimensional
database in the XL sheets? I'm sure I looked at all the sheets in the
spreadsheet AnalyticsBuilderWB_ShopFloorPerformance.v.0.2.xls. (But maybe
I'm going crazy and didn't). To me it only seems like the analytical model
is in the spreadsheet.
Q2. I see it generates a ton of .dts files. And when I open them up they
seem to me (a zero level skilled DTS person) to be very complex. The manual
says 'don't change the packages' which I gather means don't look, don't
worry, just run it and it will all be ok....again, I gather that the DTS
packages are somehow generated by BIA into their binary format and I believe
the source to target mappings are defined by the 'mappings' spreadsheet. But
in this spreadsheet I don't seem to find enough columns for all the columns
in the dimensional model. For example I can't find mappings for dim_emp_std
in the spreadsheet but I would have thought it should be there. (Or am I
missing something)?
Q3. In BIA I see lots about sharepoint portal and office objects inside web
pages and all that as a presentation layer. But I just took a look at
Reporting Services. Reporting Services looks absolutely fantastic for what
it is trying to do, and it is very obvious how easy it will be to extend
reporting services to do MUCH, MUCH more very, very easily. RDL (Report
Definition Language) is an idea long overdue and I do believe MSFT are first
with that one. I have not heard any other vendor talk about an RDL yet. I
was amazed that a report can just be exposed as a web service and you can
call it from anywhere with anything. Now THAT is a useful thing to have.
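The "call it from anywhere" point can be sketched concretely. Besides the SOAP endpoints, SSRS supports URL access, where the report path, `rs:Command=Render`, `rs:Format`, and report parameter values are passed on the query string. A minimal sketch in Python; the server name, report path, and parameter name here are made up for illustration:

```python
from urllib.parse import quote

def report_url(server, report_path, fmt, params):
    """Build an SSRS URL-access request that renders a report in the
    given format with the given parameter values. The server and
    report names used below are hypothetical."""
    base = f"http://{server}/ReportServer?{quote(report_path)}"
    parts = ["rs:Command=Render", f"rs:Format={fmt}"]
    # Report parameters are appended as plain name=value pairs.
    parts += [f"{name}={quote(str(value))}" for name, value in params.items()]
    return base + "&" + "&".join(parts)

url = report_url("reports01", "/Sales/DailySales", "PDF",
                 {"ReportDate": "2012-03-27"})
```

An HTTP GET against the resulting URL (with appropriate Windows credentials) returns the rendered bytes; the ReportServer web services expose the same rendering with more programmatic control.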
So the question is, how come BIA seems to completely ignore Reporting
Services? I would have thought they would be very closely 'joined'. For
example, why not do the front ends for BIA apps in reporting services? I am
assuming this is possible. I'm assuming reporting services can get data out
of Analysis Server because it can get data out of any ado.net server.
(I must say I thought MSFT had hidden how good Report Services is very
well...unless I'm greatly mistaken, I've only had a few hours to look at it,
it looks like a really, really useful product!!!)
Anyway, thanks in advance if anyone can let me know what I am missing in
BIA.
Best Regards
Peter Nolan
www.peternolan.com
Wow. There is a lot of info here. A couple of comments:
Q1) datatypes used in BIA -- well, we make several guesses. First we know
that all measures are numeric (that is a requirement of Analysis Services).
Second we know that fields that we generate have specific uses -- and from
that we know their datatypes. For example, the surrogate keys are integers
(as you would expect because they are identity). For those fields which are
user supplied, e.g. member names -- we just treat them as "names", i.e.
varchars. If you have something different, e.g. names which are really
integers, you will have to re-do things by hand. For those things which
could be any datatype, e.g. member properties, you can choose the
appropriate datatypes.
If there are specific tables and columns you are wondering about just tell
me and I'd be glad to explain why we did something one way or the other.
What you are seeing is the whole point of the BI Accelerator. You lay out the
logical multidimensional design and we auto-generate a relational staging
area, final data mart, OLAP structures, and DTS packages to move the data
through it. Logical data model to final app in one click :-)
And that final app should have all of the tips and tricks that you would see
in a production system; not just a rough proof-of-concept system.
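The generation idea Dave describes (logical design in, physical artifacts out) can be pictured with a toy generator that applies the datatype rules he gives above: integer identity surrogate keys, varchar member names. All names and the exact DDL shape below are illustrative, not BIA's actual output:

```python
def dimension_ddl(name, levels, member_props):
    """Toy sketch of design-to-DDL generation: derive a dimension
    table's CREATE TABLE from a logical design, using the conventions
    described above (surrogate keys are integer identities, member
    names are varchars). Column names are illustrative only."""
    cols = [f"{name}_key INT IDENTITY PRIMARY KEY"]            # surrogate key
    cols += [f"{lvl}_name VARCHAR(100)" for lvl in levels]     # member names
    cols += [f"{prop} {dtype}" for prop, dtype in member_props]  # properties
    return f"CREATE TABLE dim_{name} (\n  " + ",\n  ".join(cols) + "\n)"

ddl = dimension_ddl("product", ["category", "product"],
                    [("list_price", "DECIMAL(10,2)")])
```

BIA additionally emits the staging tables, OLAP structures, and DTS packages from the same logical source, which is what makes the one-click claim possible.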
Q2) DTS packages in BIA -- thank you. I wrote the generator code and I am
quite proud of them.
It generates a ton of packages because of the type of schema we chose to
implement. Since we generate a snowflake schema, there are tables for every
dimension and level in the dimensional structure.
We had several goals with the DTS packages. First, we wanted them to be
data-driven because we don't expect everyone to be an expert on DTS. Thus
rather than having to make changes to them, most of them have variables whose
values you can change to have the package do something different. All of this is
documented in the PAG (online doc set). One of the challenges of this was
that DTS with SQL Server 2000 doesn't have the control flow tasks needed to
make this declarative -- with 2005 we put that all in natively, and similar
packages with 2005 would be quite a bit less complex.
Second, we wanted the packages to be visible and extensible by
knowledgeable users. Nothing is hidden -- it is all up front and in your
face. Yes, a novice will look at them and despair -- but don't give up!
There is documentation in the PAG provided for all of them! Lastly you might
be interested in this white paper which talks about the DTS packages and
provides various tips and tricks beyond what the PAG has in it.
http://msdn.microsoft.com/library/de..._dts_ssabi.asp
If there is a specific step you have questions about, I'd be glad to help
also.
Q3) I am glad you like Reporting Services. I agree it is a fantastic tool.
However, it is a totally different product. BI Accelerator is a BI
application generator; not a report generator. It has a client component
only for customization. The idea was that suppose you had a report called a
"template" which was exactly the same from client to client, but one
customer called products "items" and another called products "books".
Wouldn't it be nice to ship a template along with the multidimensional
design and have the system automatically rename products to items or
products to books? That is what the client generator does with the BI
Accelerator. It ships with a ProClarity component which knows how to go
inside a ProClarity briefing book and replace one tag with another tag. A
similar facility is available from Panorama and there is an API available to
other vendors if they would like to plug into the BI Accelerator client
generator.
Hope this helps.
Dave Wickert [MSFT]
dwickert@.online.microsoft.com
Program Manager
BI SystemsTeam
SQL BI Product Unit (Analysis Services)
This posting is provided "AS IS" with no warranties, and confers no rights.
"Peter Nolan" <peter@.peternolan.com> wrote in message
news:uCEWHDlBFHA.2876@.TK2MSFTNGP12.phx.gbl...
> Hi All,
> this seems like the best place for this question.
> I've had cause to review the MSFT position/tools in the BI area. I'm
> surprised!!! I feel like MSFT have made MUCH more progress than generally
> talked about in newsgroups and customers I work with. All I ever hear
from
> MSFT with respect to DW/BI is 'the next release is going to be great.'
Not,
> 'Hey, take a look at BIA/Reporting Services'...?
> Q1. In BI accelerator I installed and tested out the shopfloor application
> (manufacturing) and I generated the shopfloor database from the XL Sheet.
> But I can't see where it got the table definitions for the staging table
and
> the underlying dimensional database from. For example, where are the
> table/column/datatype definitions of the staging area and the dimensional
> database in the XL sheets? I'm sure I looked at all the sheets in the
> spreadsheet AnalyticsBuilderWB_ShopFloorPerformance.v.0.2.xls. (But maybe
> I'm going crazy and didn't). To me it only seems like the analytical
model
> is in the spreadsheet.
>
> Q2. I see it generates a ton of .dts files. And when I open them up they
> seem to me (a zero level skilled DTS person) to be very complex. The
manual
> says 'don't change the packages' which I gather means don't look, don't
> worry, just run it and it will all be ok....again, I gather that the DTS
> packages are somehow generated by BIA into their binary format and I
believe
> the source to target mappings are defined by the 'mappings' spreadsheet.
But
> in this spreadsheet I don't seem to find enough columns for all the
columns
> in the dimensional model. For example I can't find mappings for
dim_emp_std
> in the spreadsheet but I would have thought it should be there. (Or am I
> missing something)?
>
> Q3. In BIA I see lots about sharepoint portal and office objects inside
web
> pages and all that as a presentation layer. But I just took a look at
> Reporting Services. Reporting Services looks absolutely fantastic for
what
> it is trying to do and it as very obvious how easy it will be to extend
> reporting services to do MUCH, MUCH more very, very easily. RDL (Report
> Definition Language) is an idea long overdue and I do believe MSFT are
first
> with that one. I have not heard any other vendor talk about an RDL yet.
I
> was amazed that a report can just be exposed as a web service and you can
> call it from anywhere with anything. Now THAT is a useful thing to have.
> So the question is, how come BIA seems to completely ignore Reporting
> Services? I would have thought they would be very closely 'joined'. For
> example, why not do the front ends for BIA apps in reporting services? I
am
> assuming this is possible. I'm assuming reporting services can get data
out
> of Analysis Server because it can get data out of any ado.net server.
> (I must say I thought MSF had hidden how good Report Services is very
> well...unless I'm greatly mistaken, I've only had a few hours to look at
it,
> it looks like a really, really useful product!!!)
> Anyway, thanks in advance if anyone can let me know what I am missing in
> BIA.
> Best Regards
> Peter Nolan
> www.peternolan.com
>
|||Hi Dave,
wow, nice to see the guy who 'wrote the code' is watching the forum.
Thanks for your feedback. Makes sense and I can see where BIA could
head in the next release.
You'll be pleased to hear this...the reason for my interest is some
colleagues and I are bringing to market a product that will be based
solely on MSFT BI technologies and part of that is we are researching
everything we can find about MSFT BI and the way forward with MSFT
BI.....I haven't seen any kind of comprehensive suite of slides that
is public in this matter...if you happen to know where one can be
downloaded from that would be great.
I still have to figure out if BIA can fit into what we are doing. We
shall see.
My other comment...
"Q2) DTS packages in BIA -- thank you. I wrote the generator =ADcode and
I am
quite proud of them."
Let's just say I recognise a good idea and a smart guy when I see
one...;-)
One of my specialities is ETL and reducing the cost of writing ETL for
my clients. I've been doing DW for 14 years now and ETL consumes so
much of the money it's the obvious thing to keep working on to reduce.
About 2.5 years ago I investigated the possibility of writing a
generator for ETL jobs in Informatica and DataStage. I found it was
certainly possible to do so. The 'problem' was that it would be
impossible for an 'independent' to make any money out of it. If the
idea did well it would be trivial for the vendor to write the same
functionality and release it. My generator was planned to produce XML
as both INFA/DS can export/import jobs in XML format.
So I was impressed that you have come up with the same idea and from
looking at the time frames you must have had that idea before me...well
done..;-)...And that you are able to produce the DTS jobs in binary.
(Though I'm sure being MSFT you can see the source code/structures for
DTS.)
Me, I changed course after my evaluation and wrote my own ETL tool. It
is very productive. Rather than map fields at field level it maps at
table levels and moves fields within tables on matching column names.
So it is not actually necessary to ever define source to target
mappings anywhere. It discovers the column names at run time so that
when more columns are added there is no code to regenerate or change
and it is the changing of ETL code that is expensive in DW maintenance.
This is another reason why I was impressed with what you have
done....yours is the first example I have seen where it might be
possible to get away with ETL changes without re-testing in the
non-production environment while still using a 'real' ETL tool.
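Peter's match-on-column-name approach is easy to sketch: discover each table's columns at run time, intersect them, and generate the INSERT...SELECT from the intersection, so adding a column to both tables needs no code change. A minimal sketch using SQLite; the table and column names are made up, and a real implementation would also handle quoting and type mismatches:

```python
import sqlite3

def copy_matching_columns(conn, src, dst):
    """Move rows from src to dst, mapping columns by identical name.
    Column lists are discovered at run time (the idea described above),
    so no source-to-target field mapping is ever written down."""
    def cols(table):
        # PRAGMA table_info rows are (cid, name, type, ...); index 1 is the name.
        return {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    shared = sorted(cols(src) & cols(dst))
    col_list = ", ".join(shared)
    conn.execute(f"INSERT INTO {dst} ({col_list}) SELECT {col_list} FROM {src}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_cust (cust_id INTEGER, name TEXT, extra TEXT)")
conn.execute("CREATE TABLE dim_cust (cust_key INTEGER PRIMARY KEY, "
             "cust_id INTEGER, name TEXT)")
conn.execute("INSERT INTO stg_cust VALUES (1, 'Acme', 'x')")
copy_matching_columns(conn, "stg_cust", "dim_cust")
```

Only `cust_id` and `name` are copied here; `extra` has no counterpart in the target and is ignored, and a column added to both tables would flow through automatically on the next run.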
We have also settled on a spreadsheet as the way to record all
mappings. But we had not automated the generation of jobs/code because
even on very large projects the amount of time taken is very small. (I
recently built the ETL for a staging area+DW with 100 tables and 3,000
fields in 2 weeks so there didn't seem to be a need to speed it up even
further.) So your idea of putting a 'generate' button in the
spreadsheet and a whole set of ticks on what to generate out of the
spreadsheet also intrigued me. It looks like a really 'neat trick'. I
am an 'XL-dummy' so that thought had not crossed my mind before. I must
ask one of my colleagues how that was done......The thought has
crossed my mind that our spreadsheet could be extended to record the other
information required for ETL generation, and we could cut out even the 2
weeks of work we do now by generating what we need to generate...
So, like I said, I just recognised a good idea and a smart guy...;-)
All the best...I'll be investigating BIA some more and look forward to
seeing what happens next...
Best Regards
Peter Nolan
www.peternolan.com
|||Actually the DTS generator uses the normal SQL Server APIs to create its
packages. All I did was to create the packages that I wanted; then saved the
package as VB. Then I reverse engineered the VB back into production
quality code. We never had to look at the source code for DTS at all.
Dave Wickert [MSFT]
dwickert@.online.microsoft.com
Program Manager
BI SystemsTeam
SQL BI Product Unit (Analysis Services)
This posting is provided "AS IS" with no warranties, and confers no rights.
"Peter Nolan" <peter@.peternolan.com> wrote in message
news:1107346392.279399.284200@.f14g2000cwb.googlegr oups.com...
Hi Dave,
wow, nice to see the guy who 'wrote the code' is watching the forum.
Thanks for your feedback. Makes sense and I can see where BIA could
head in the next release.
You'll be pleased to hear this...the reason for my interest is some
colleagues and I are bringing to market a product that will be based
solely on MSFT BI technologies and part of that is we are researching
everything we can find about MSFT BI and the way forward with MSFT
BI.....I haven't seen any kind of comprehensive suite of slides that
is public in this matter...if you hapeen to know where one can be
downloaded from that would be great.
I stall have to figure out if BIA can fit into what we are doing. We
shall see.
My other comment...
"Q2) DTS packages in BIA -- thank you. I wrote the generator Xcode and
I am
quite proud of them."
Let's just say I recognise a good idea and a smart guy when I see
one...;-)
One of my specialities is ETL and reducing the cost of writing ETL for
my clients. I've been doing DW for 14 years now and ETL consumes so
much of the money it's the obvious thing to keep working on to reduce.
About 2.5 years ago I investigated the possibility of writing a
generator for ETL jobs in Informatica and DataStage. I found it was
certainly possible to do so. The 'problem' was that it would be
impossible for an 'independent' to make any money out of it. If the
idea did well it would be trivial for the vendor to write the same
functionality and release it. My generator was planned to produce XML
as both INFA/DS can export/import jobs in XML format.
So I was impressed that you have come up with the same idea and from
looking at the time frames you must have had that idea before me...well
done..;-)...And that you are able to produce the DTS jobs in binary.
(Though I'm sure being MSFT you can see the source code/structures for
DTS.)
Me, I changed course after my evaluation and wrote my own ETL tool. It
is very productive. Rather than map fields at field level it maps at
table levels and moves fields within tables on matching column names.
So it is not actually necessary to ever define source to target
mappings anywhere. It discovers the column names at run time so that
when more columns are added there is no code to regenerate or change
and it is the changing of ETL code that is expensive in DW maintenance.
This is another reason why I was impressed with what you have
done....yours is the first example I have seen where it might be
possible to get away with ETL changes without re-testing in the
non-production environment while still using a 'real' ETL tool.
We have also settled on a spreadsheet as the way to record all
mappings. But we had not automated the generation of jobs/code because
even on very large projects the amount of time taken is very small. (I
recently built the ETL for a staging area+DW with 100 tables and 3,000
fields in 2 weeks so there didn't seem to be a need to speed it up even
further.) So your idea of putting a 'generate' button in the
spreadsheet and a whole set of ticks on what to generate out of the
spreadsheet also intrigued me. It looks like a really 'neat trick'. I
am an 'XL-dummy' so that thought had not crossed my mind before. I must
ask one of my colleagues how that was done......The thought has
crossed my mind that our spreadsheet could be extended to record other
information required for ETL generation and we could cut even the 2
weeks work we do now out by generating what we need to generate...
So, like I said, I just recognised a good idea and a smart guy...;-)
All the best...I'll be investigating BIA some more and look forward to
seeing what happens next...
Best Regards
Peter Nolan
www.peternolan.com
|||Hi Dave,
interesting...I am not up on DTS and I had never heard that the
packages could be saved to VB...I must look into DTS when the next
release comes out.....
I've asked one of my XL-knowledgeable guys how to put something like a
'generate' button into the spreadsheets we use to write our mappings
and to call some C++ code we have......In fact, we type the
definitions into a spreadsheet and then drop them into the database to
run the code to generate the tables/views!!! (LOL) This is when we
aren't using a data modelling tool. Most places have a standard tool
that must be used for table definitions...
Best Regards
Peter Nolan

BI Accelerator + Applications + Report Services

Hi All,
this seems like the best place for this question.
I've had cause to review the MSFT position/tools in the BI area. I'm
surprised!!! I feel like MSFT have made MUCH more progress than generally
talked about in newsgroups and customers I work with. All I ever hear from
MSFT with respect to DW/BI is 'the next release is going to be great.' Not,
'Hey, take a look at BIA/Reporting Services'...'
Q1. In BI accelerator I installed and tested out the shopfloor application
(manufacturing) and I generated the shopfloor database from the XL Sheet.
But I can't see where it got the table definitions for the staging table and
the underlying dimensional database from. For example, where are the
table/column/datatype definitions of the staging area and the dimensional
database in the XL sheets? I'm sure I looked at all the sheets in the
spreadsheet AnalyticsBuilderWB_ShopFloorPerformance.v.0.2.xls. (But maybe
I'm going crazy and didn't). To me it only seems like the analytical model
is in the spreadsheet.
Q2. I see it generates a ton of .dts files. And when I open them up they
seem to me (a zero level skilled DTS person) to be very complex. The manual
says 'don't change the packages' which I gather means don't look, don't
worry, just run it and it will all be ok....again, I gather that the DTS
packages are somehow generated by BIA into their binary format and I believe
the source to target mappings are defined by the 'mappings' spreadsheet. But
in this spreadsheet I don't seem to find enough columns for all the columns
in the dimensional model. For example I can't find mappings for dim_emp_std
in the spreadsheet but I would have thought it should be there. (Or am I
missing something?)
Q3. In BIA I see lots about sharepoint portal and office objects inside web
pages and all that as a presentation layer. But I just took a look at
Reporting Services. Reporting Services looks absolutely fantastic for what
it is trying to do, and it was very obvious how easy it will be to extend
reporting services to do MUCH, MUCH more very, very easily. RDL (Report
Definition Language) is an idea long overdue and I do believe MSFT are first
with that one. I have not heard any other vendor talk about an RDL yet. I
was amazed that a report can just be exposed as a web service and you can
call it from anywhere with anything. Now THAT is a useful thing to have.
So the question is, how come BIA seems to completely ignore Reporting
Services? I would have thought they would be very closely 'joined'. For
example, why not do the front ends for BIA apps in reporting services? I am
assuming this is possible. I'm assuming reporting services can get data out
of Analysis Server because it can get data out of any ado.net server.
(I must say I thought MSFT had hidden how good Reporting Services is very
well...unless I'm greatly mistaken, I've only had a few hours to look at it,
it looks like a really, really useful product!!!)
Anyway, thanks in advance if anyone can let me know what I am missing in
BIA.
Best Regards
Peter Nolan
www.peternolan.com|||Wow. There is a lot of info here. A couple of comments:
Q1) datatypes used in BIA -- well, we make several guesses. First, we know
that all measures are numeric (that is a requirement of Analysis Services).
Second, we know that the fields we generate have specific uses -- and from
that we know their datatypes. For example, the surrogate keys are integers
(as you would expect, because they are identity columns). Fields which are
user-supplied, e.g. member names, we just treat as "names", i.e.
varchars. If you have something different, e.g. names which are really
integers, you will have to re-do things by hand. For those things which
could be any datatype, e.g. member properties, you can choose the
appropriate datatypes.
If there are specific tables and columns you are wondering about just tell
me and I'd be glad to explain why we did something one way or the other.
What you are seeing is the whole point of the BI Accelerator. You lay out the
logical multidimensional design and we auto-generate a relational staging
area, final data mart, OLAP structures, and DTS packages to move the data
through it. Logical data model to final app in one click :-)
And that final app should have all of the tips and tricks that you would see
in a production system; not just a rough proof-of-concept system.
Q2) DTS packages in BIA -- thank you. I wrote the generator code and I am
quite proud of them.
It generates a ton of packages because of the type of schema we chose to
implement. Since we generate a snowflake schema, there are tables for every
dimension and level in the dimensional structure.
We had several goals with the DTS packages. First, we wanted them to be
data-driven because we don't expect everyone to be an expert on DTS. Thus,
rather than having to make changes to them, most of them have variables whose
values you can change to have the package do something different. All of this is
documented in the PAG (online doc set). One of the challenges here was
that DTS in SQL Server 2000 doesn't have the control flow tasks needed to
make this declarative -- with 2005, we put that all in natively, and similar
packages with 2005 would be quite a bit less complex.
Second, we wanted the packages to be visible and extensible by
knowledgeable users. Nothing is hidden -- it is all up front and in your
face. Yes, a novice will look at them and despair -- but don't give up!
There is documentation in the PAG provided for all of them! Lastly you might
be interested in this white paper which talks about the DTS packages and
provides various tips and tricks beyond what the PAG has in it.
http://msdn.microsoft.com/library/d...
dts_ssabi.asp
If there is a specific step you have questions about, I'd be glad to help
also.
Q3) I am glad you like Reporting Services. I agree it is a fantastic tool.
However, it is a totally different product. BI Accelerator is a BI
application generator; not a report generator. It has a client component
only for customization. The idea was that suppose you had a report called a
"template" which was exactly the same from client to client, but one
customer called products "items" and another called products "books".
Wouldn't it be nice to ship a template along with the multidimensional
design and have the system automatically rename products to items or
products to books. That is what the client generator does with the BI
Accelerator. It ships with a Proclarity component which knows how to go
inside a Proclarity briefing book and replace one tag for another tag. A
similar facility is available from Panorama and there is an API available to
other vendors if they would like to plug into the BI Accelerator client
generator.
Hope this helps.
--
Dave Wickert [MSFT]
dwickert@.online.microsoft.com
Program Manager
BI SystemsTeam
SQL BI Product Unit (Analysis Services)
--
This posting is provided "AS IS" with no warranties, and confers no rights.
"Peter Nolan" <peter@.peternolan.com> wrote in message
news:uCEWHDlBFHA.2876@.TK2MSFTNGP12.phx.gbl...
|||Hi Dave,
wow, nice to see the guy who 'wrote the code' is watching the forum.
Thanks for your feedback. Makes sense and I can see where BIA could
head in the next release.
You'll be pleased to hear this...the reason for my interest is some
colleagues and I are bringing to market a product that will be based
solely on MSFT BI technologies and part of that is we are researching
everything we can find about MSFT BI and the way forward with MSFT
BI.....I haven't seen any kind of comprehensive suite of slides that
is public in this matter...if you happen to know where one can be
downloaded from, that would be great.
I still have to figure out if BIA can fit into what we are doing. We
shall see.
My other comment...
"Q2) DTS packages in BIA -- thank you. I wrote the generator code and I am
quite proud of them."
Let's just say I recognise a good idea and a smart guy when I see
one...;-)
One of my specialities is ETL and reducing the cost of writing ETL for
my clients. I've been doing DW for 14 years now and ETL consumes so
much of the money it's the obvious thing to keep working on to reduce.
About 2.5 years ago I investigated the possibility of writing a
generator for ETL jobs in Informatica and DataStage. I found it was
certainly possible to do so. The 'problem' was that it would be
impossible for an 'independent' to make any money out of it. If the
idea did well it would be trivial for the vendor to write the same
functionality and release it. My generator was planned to produce XML
as both INFA/DS can export/import jobs in XML format.
So I was impressed that you have come up with the same idea and from
looking at the time frames you must have had that idea before me...well
done..;-)...And that you are able to produce the DTS jobs in binary.
(Though I'm sure being MSFT you can see the source code/structures for
DTS.)
Me, I changed course after my evaluation and wrote my own ETL tool. It
is very productive. Rather than map fields at field level it maps at
table levels and moves fields within tables on matching column names.
So it is not actually necessary to ever define source to target
mappings anywhere. It discovers the column names at run time so that
when more columns are added there is no code to regenerate or change
and it is the changing of ETL code that is expensive in DW maintenance.
This is another reason why I was impressed with what you have
done....yours is the first example I have seen where it might be
possible to get away with ETL changes without re-testing in the
non-production environment while still using a 'real' ETL tool.
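The name-matching idea is easy to sketch. Below is a minimal Python illustration (the row shapes and function name are my own invention, not Peter's actual tool): the source-to-target mapping is discovered at run time by intersecting the column names on each side, so adding a column to both source and target needs no code change or regeneration.

```python
def move_matching_columns(source_rows, target_columns):
    """Copy only the columns whose names exist on both sides.

    The mapping is discovered at run time from the column names
    themselves, so there is no source-to-target mapping to maintain:
    a new column added to both source and target flows through
    automatically.
    """
    moved = []
    for row in source_rows:
        shared = set(row) & set(target_columns)
        moved.append({col: row[col] for col in shared})
    return moved
```

A column present only on one side (a staging scratch field, say) is simply ignored rather than raising an error, which is what makes the approach cheap to maintain.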
We have also settled on a spreadsheet as the way to record all
mappings. But we had not automated the generation of jobs/code because
even on very large projects the amount of time taken is very small. (I
recently built the ETL for a staging area+DW with 100 tables and 3,000
fields in 2 weeks so there didn't seem to be a need to speed it up even
further.) So your idea of putting a 'generate' button in the
spreadsheet and a whole set of ticks on what to generate out of the
spreadsheet also intrigued me. It looks like a really 'neat trick'. I
am an 'XL-dummy' so that thought had not crossed my mind before. I must
ask one of my colleagues how that was done......The thought has
crossed my mind that our spreadsheet could be extended to record other
information required for ETL generation and we could cut even the 2
weeks work we do now out by generating what we need to generate...
So, like I said, I just recognised a good idea and a smart guy...;-)
All the best...I'll be investigating BIA some more and look forward to
seeing what happens next...
Best Regards
Peter Nolan
www.peternolan.com|||Actually the DTS generator uses the normal SQL Server APIs to create its
packages. All I did was to create the packages that I wanted; then saved the
package as VB. Then I reverse-engineered the VB back into production
quality code. We never had to look at the source code for DTS at all.
--
Dave Wickert [MSFT]
dwickert@.online.microsoft.com
Program Manager
BI SystemsTeam
SQL BI Product Unit (Analysis Services)
--
This posting is provided "AS IS" with no warranties, and confers no rights.
"Peter Nolan" <peter@.peternolan.com> wrote in message
news:1107346392.279399.284200@.f14g2000cwb.googlegroups.com...
|||Hi Dave,
interesting...I am not up on DTS and I had never heard that the
packages could be saved to VB...I must look into DTS when the next
release comes out.....
I've asked one of my XL-knowledgeable guys how to put something like a
'generate' button into the spreadsheets we use to write our mappings
and to call some C++ code we have......In fact, we type the
definitions into a spreadsheet and then drop them into the database to
run the code to generate the tables/views!!! (LOL) This is when we
aren't using a data modelling tool. Most places have a standard tool
that must be used for table definitions...
Best Regards
Peter Nolan

Saturday, February 25, 2012

Better MDX Query in 2005?

I've worked on several analysis services and reporting services solutions but this has me stumped. I have an X,Y scatter chart in RS that needs to hit a cube. So I need an MDX query that returns something like this:

DiseaseAbrev Pathway Feasibility
ADD 2.7 1.9
XS 4.0 1.0
YYY 1.4 2.0

The goal is to have two columns, Pathway and Feasibility, that contain a weighted measure that is calculated by summing a set of other measures times a weighting factor. The Disease Score Type dimension has a parent-child hierarchy set up so that Pathway and Feasibility are two members who roll up the weighted leaf members. When applied to the Score measure (see query below) this gives me what I want.

SELECT {[Disease Score Type].[Parent Id].&[2],[Disease Score Type].[Parent Id].&[1]} ON COLUMNS,
[Disease].[Disease Abbrev].[Disease Abbrev] ON ROWS
FROM [IVDB]
WHERE ([Measures].[Score], [Disease Selection].[Short List].&[Short List])

But, and this is a big problem, Reporting Services requires measures and ONLY measures in the first axis. So this does not work, which is really frustrating since this is really the most elegant way of modelling the problem domain. How can I write a better query that returns a Pathway and a Feasibility column containing weighted sums of the Score measure for just those two dimension members?

Can someone please help me?

There is nothing really wrong with your MDX, it is an unreasonable limitation of Reporting Services which has been raised by a few people, including fellow MVP Chris Webb here: http://cwebbbi.spaces.live.com/Blog/cns!1pi7ETChsJ1un_2s41jm9Iyg!163.entry

A workaround is to create an OLE DB data source in RS2005, then choose the "Microsoft OLE DB Provider for Analysis Services 9.0" provider. You don't get the fancy designer and have to enter your MDX by hand, but you can structure your query any way you like.

|||

Thank you! The work around is fine for this report (and the dozen or so left to do...). I don't need the Query Builder in Reporting Services since I usually wind up writing the MDX directly to get what I want. And if I want a fancy designer the one in SQL Management Studio works very well. The only thing you really give up this way is direct support for parameters etc - but I can use the same tricks as the 'bad-old-days' of Analysis Services 2000 where you could fake parameters by using string functions to build the MDX!
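That string-building trick can be sketched in a few lines of Python (the cube, dimension, and member names are taken from the query earlier in the thread; the helper names, and doubling `]` to escape bracketed MDX identifiers, are my own illustration, not a Reporting Services API):

```python
def escape_mdx_name(name):
    # Inside a bracketed MDX identifier, a literal ']' is escaped
    # by doubling it: [A]B] must be written [A]]B].
    return name.replace("]", "]]")

def build_short_list_query(selection):
    # Splice a user-supplied member name into the WHERE clause,
    # the old AS2000-era way of faking a report parameter.
    member = escape_mdx_name(selection)
    return (
        "SELECT {[Disease Score Type].[Parent Id].&[2], "
        "[Disease Score Type].[Parent Id].&[1]} ON COLUMNS, "
        "[Disease].[Disease Abbrev].[Disease Abbrev] ON ROWS "
        "FROM [IVDB] "
        "WHERE ([Measures].[Score], "
        f"[Disease Selection].[Short List].&[{member}])"
    )
```

The escaping step matters: without it a stray `]` in the user's value breaks the query (or worse, changes it).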

It's just too bad that 2005 has a set of restrictions that make its new features useless for anything but the most trivial MDX.

Thank you again!

Micah

|||

One way of working around the Analysis Services Provider limitation in this case is to use calculated measures, like:

With

Member [Measures].[PathwayScore] as

([Measures].[Score], [Disease Score Type].[Parent Id].&[2])

Member [Measures].[FeasibilityScore] as

([Measures].[Score], [Disease Score Type].[Parent Id].&[1])

SELECT {[Measures].[PathwayScore], [Measures].[FeasibilityScore]} ON COLUMNS,
[Disease].[Disease Abbrev].[Disease Abbrev] ON ROWS
FROM [IVDB]
WHERE ([Disease Selection].[Short List].&[Short List])

Better Charting capability

Has anyone heard whether Microsoft plans to update the charting engine for Reporting Services? Right now I find the charting capability rather limited and would like to see richer options. Is it better not to wait for this feature because it is far off in the future, or are better charting features simply not in the scope of Microsoft's plans for Reporting Services? In either case, the best solution would be to purchase the full capabilities of Dundas Chart for Reporting Services.

if you can wait for the next release(s) - Microsoft acquired the data visualization products of Dundas

read this article

http://blogs.msdn.com/bwelcker/archive/2007/06/04/dreamy-reporting-services-at-teched.aspx

regards

andreas

Sunday, February 19, 2012

Best way to stop MS SQL in a Failover Cluster?

Hi,
Is it best practice to use SEM or the Services applet to stop SQL2000 on a
failover cluster or is there a way in Cluster Admin?
Thanks
Chris Wood
Alberta Department of Energy
CANADA
Hi,
You should use the Cluster Administrator console to take the SQL Server resource
offline.
Stopping the SQL service in SEM will initiate a failover.
Danijel
"Chris Wood" <anonymous@.discussions.microsoft.com> wrote in message
news:ei5JlFx$EHA.2568@.TK2MSFTNGP10.phx.gbl...
|||Danijel,
But you could use Services to stop SQL?
Chris
"Danijel Novak" <danijel.novak@.snt.si> wrote in message
news:GDRHd.8418$F6.1490871@.news.siol.net...
|||Nope, it would initiate a failover too...
Best practice is to use Cluster Administrator to shutdown the service.
Danijel
"Chris Wood" <anonymous@.discussions.microsoft.com> wrote in message
news:%23JoVHcx$EHA.1084@.tk2msftngp13.phx.gbl...
|||That's what I wanted to know.
Thanks
Chris
"Danijel Novak" <danijel.novak@.snt.si> wrote in message
news:cdSHd.8423$F6.1490201@.news.siol.net...
|||As of SQL Server 2000 Failover Clustering, you should be able to use SQL Enterprise Manager, SQL Server Services Manager, Query Analyzer and Cluster Administrator to start/stop SQL Server resources as all
these are cluster aware. This was not true for SQL Server 7.0 Failover Clustering and in that case only Cluster Administrator had to be used to start/stop the SQL Server Services.
Additional Information:
======================
INF: Clustered SQL Server Do's, Don'ts, and Basic Warnings
http://support.microsoft.com/?kbid=254321
Here is the relevant section from the above mentioned Microsoft Knowledge Base article
================================================== ================================================== ========================
Start and stop SQL Server services
SQL Server 6.5 and SQL Server 7.0 virtual servers
To start or stop SQL Server, SQL Server Executive, or SQL Agent services from a SQL Server 6.5 or SQL Server 7.0 virtual server, you must use the Microsoft Cluster Administrator or the Cluster.exe command line
tool. If you attempt to start or stop services in any other way (for instance, from Control Panel, SQL Service Manager, or SQL Enterprise Manager), the registry may be corrupted, and you may need to uncluster or
completely reinstall SQL Server. The most common sign of having started a service incorrectly is that the service accounts appear as a jumble of ASCII characters. If you need to start SQL Server from a command
line, you must use the Cluster Administrator or Cluster.exe tool to first take the SQL Server, SQL Executive, or SQL Agent services offline. When you start SQL Server from a command line, connectivity takes place
using the virtual server name. The only way to make a local connection is if the resources are owned by the node from which you originally installed SQL Server.
SQL Server 2000 virtual servers
SQL Server 2000 virtual servers do not have the above restrictions. We recommend that you use SQL Server Enterprise Manager, the SQL Server Services applet, or Cluster Administrator to start and stop SQL
Server 2000 virtual server services. Although you can use Service Control Manager or Control Panel/Services to start and stop the services without damaging the registry, these options will not cause the
services to stay in a stopped state. Instead, the services will be detected by the clustered server and you will receive multiple event ID 17052 error messages in your SQL Server error log that are similar to the following:
[sqsrvres] CheckServiceAlive: Service is dead
[sqsrvres] OnlineThread: service stopped while waiting for QP
[sqsrvres] OnlineThread: Error 1 bringing resource online
After you receive these error messages, SQL Server will be restarted by the cluster service. This behavior is expected for these types of errors.
================================================== ================================================== ========================
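For scripted shutdowns the same rule applies: drive Cluster.exe rather than stopping the Windows service. A hedged Python sketch (the resource name "SQL Server" is an assumption; use whatever name Cluster Administrator shows for your instance's SQL Server resource):

```python
import subprocess

def take_sql_resource_offline(resource_name="SQL Server", run=False):
    # Cluster.exe res <name> /offline takes the clustered resource
    # offline cleanly; stopping the service directly would just be
    # detected by the cluster service and restarted (or fail over).
    cmd = ["cluster.exe", "res", resource_name, "/offline"]
    if run:
        # Only execute on an actual cluster node.
        subprocess.check_call(cmd)
    return cmd
```

Returning the command list (rather than always executing) makes the helper easy to dry-run and log before touching a production cluster.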
HTH,
Best Regards,
Uttam Parui
Microsoft Corporation
This posting is provided "AS IS" with no warranties, and confers no rights.
Are you secure? For information about the Strategic Technology Protection Program and to order your FREE Security Tool Kit, please visit http://www.microsoft.com/security.
Microsoft highly recommends that users with Internet access update their Microsoft software to better protect against viruses and security vulnerabilities. The easiest way to do this is to visit the following websites:
http://www.microsoft.com/protect
http://www.microsoft.com/security/guidance/default.mspx

Sunday, February 12, 2012

Best way to format a letter so that text can break across a page

I'm currently using SQL reporting services to produce letters. Most of the
formatting can be achieved however I'm having problems with page breaks.
I've currently set up the report using Text Boxes in the following format:
Name
Address
Salutation
Main Content
From
I need to ensure that the Main Content follows on directly from the
Salutation field. This works fine if the letter is 1 or 3 pages long.
However, in the case where the Main Content requires a few lines on page 2 it
places all the main content onto Page 2.
How can I ensure that the Main Content always follows directly on from the
Salutation? Is it possible to do this using a table instead of text boxes?
Any help would be greatly appreciated!|||I should have mentioned that the report is being produced in PDF format.
I read in a newsgroup article that PDF won't split a single row between 2
pages unless the row itself is longer than one page. Is this still the case
or is there a workaround? If this it true, then one solution might be to
bring back the main content text in sections although this isn't too
appealing!
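If the sectioning route is taken, one hedged sketch of it (names and the page budget are my own invention): split the letter body on paragraph boundaries into chunks no larger than roughly a page, then bind each chunk to its own table row so the PDF renderer can break the letter between rows.

```python
def split_into_sections(body, max_chars=1800):
    """Split letter body text into paragraph-aligned chunks.

    Each chunk becomes one table row, so the PDF renderer can break
    between rows instead of pushing the whole body to the next page.
    max_chars is a rough one-page character budget (an assumption to
    tune against the real report layout).
    """
    sections, current = [], ""
    for para in body.split("\n\n"):
        candidate = (current + "\n\n" + para) if current else para
        if len(candidate) > max_chars and current:
            # Close the current section and start a new one at a
            # paragraph boundary, never mid-paragraph.
            sections.append(current)
            current = para
        else:
            current = candidate
    if current:
        sections.append(current)
    return sections
```

Because the split always falls on a paragraph boundary, the rendered letter still reads naturally even when a break happens to land between two rows.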
> Any help would be greatly appreciated!