We have a data warehouse of approximately 200 GB. Our data is separated by
yearly quarters into separate tables and filegroups: each quarter is in one
table, and there is one table per filegroup. A view unions all the tables and
makes the data accessible to the application. Currently we are backing up the
full database every night. I need to make a case to convince management to
archive the data, and I need to know the best approach. The two main factors
to consider are speed and ease of recovering the archived data if users need
it. I am considering the options below. Which one would you recommend?
1. BCP out the data and shrink the filegroups; to recover, expand the
filegroups and BCP the data back in.
2. Same as #1, but instead move the data to a separate database and then back
that database up; to recover, restore it and copy the data across.
3. Place the data in separate databases and back up/drop/restore them as
needed. (The view will need to be changed each time. I also don't know how
this will affect SELECT performance if the view is spread across databases
on the same server.)
4. Back up the individual filegroups, empty them, and then restore them
individually if needed. (I don't know if this is possible.)

I have a similar situation at my shop. What we did was create an Archive db
on a separate server that was a restore of our Live db. We then truncated
the current year's data out of the Archive db. Next, we truncated all the
previous years' data from the Live db. We left the tables in place so we
didn't have to update any of our views. Needless to say, queries on the
Live db are much faster since there is a lot less data. Our Live db is
about 150 GB and our Archive db is about 100 GB. The other thing that was
helpful about this scenario is that we kept our support tables intact in
our Archive db. Additionally, we replicate the support tables from Live to
Archive, so if a request comes in from a customer who wants information
from a previous year, it's a piece of cake to run the report on the Archive
db since all the support tables/parameters are in place.
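That workflow can be sketched roughly in T-SQL; all database, filegroup, and
table names below (Live, Archive, dbo.Sales2003Q1, the backup paths) are
illustrative stand-ins for your own, and the restore's MOVE clauses assume a
single data and log file:

```sql
-- 1. Restore a copy of the Live db as the Archive db on the archive server.
BACKUP DATABASE Live TO DISK = N'\\backupshare\Live_full.bak' WITH INIT;
RESTORE DATABASE Archive
    FROM DISK = N'\\backupshare\Live_full.bak'
    WITH MOVE 'Live_Data' TO N'E:\Data\Archive.mdf',
         MOVE 'Live_Log'  TO N'F:\Log\Archive.ldf';

-- 2. On the archive server: remove the current year's rows from Archive.
--    TRUNCATE works here because each quarter lives in its own table.
TRUNCATE TABLE Archive.dbo.Sales2004Q1;
TRUNCATE TABLE Archive.dbo.Sales2004Q2;

-- 3. On the live server: remove the prior years' rows from Live, leaving
--    the empty tables in place so the UNION view still compiles.
TRUNCATE TABLE Live.dbo.Sales2003Q1;
TRUNCATE TABLE Live.dbo.Sales2003Q2;
TRUNCATE TABLE Live.dbo.Sales2003Q3;
TRUNCATE TABLE Live.dbo.Sales2003Q4;
```

TRUNCATE TABLE is minimally logged, so emptying the large quarter tables this
way is much faster than DELETE and avoids bloating the transaction log.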
HTH.
Andre
"DBA72" <DBA72@.discussions.microsoft.com> wrote in message
news:9CE1DDB3-DB6D-42C3-9187-64B8E51212CE@.microsoft.com...
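On option 4 from the original question: filegroup-level backup and restore is
possible. A minimal sketch, with illustrative database/filegroup/path names,
would be:

```sql
-- Back up a single quarter's filegroup (names are illustrative).
BACKUP DATABASE Warehouse
    FILEGROUP = 'FG_2002Q3'
    TO DISK = N'\\backupshare\Warehouse_FG_2002Q3.bak' WITH INIT;

-- To bring the quarter back later, restore just that filegroup...
RESTORE DATABASE Warehouse
    FILEGROUP = 'FG_2002Q3'
    FROM DISK = N'\\backupshare\Warehouse_FG_2002Q3.bak'
    WITH NORECOVERY;
-- ...and then roll it forward with the subsequent log backups so it is
-- consistent with the rest of the database.
RESTORE LOG Warehouse
    FROM DISK = N'\\backupshare\Warehouse_log.trn'
    WITH RECOVERY;
```

The catch is that last step: a restored filegroup must be rolled forward
through all later log backups before it is usable, so you cannot simply empty
a filegroup and later restore an old backup of it into the same database as
described in option 4 (the roll-forward would replay the deletes). That
limitation is a point in favor of the separate-archive-database approaches in
options 2 and 3.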