
Wednesday, March 28, 2012

Production release methodology

Hi,

Our DBA group is debating production release methodology. We always
perfect our deployment package (typically a script) against the
database in a Staging environment, before executing the package
against Production. One side argues that the safest approach is to
create a script containing all schema changes, data changes, stored
procs, functions, etc. Run that script against Staging until it is
error-free. Then, when you run it against Production on Release Night
you know it will also be error-free, since Staging is a copy of
Production. However, any changes to the deployment, such as updates
to procs (and there are always a bunch in the pre-deployment period)
must be manually implemented on the script.

The other side wants to take advantage of the .NET environment. We
keep all procs, views, and functions in a Release folder on VSS,
execute a Get Latest Version (Recursive) from VisualStudio, and run
against Staging/Production. Any changes to the procs are
automatically propagated from VSS to VS, and there is no manual
editing of the script. You still need a script for table and data
changes. The "script everything" side of the debate feels this method
is not as reliable, especially since when views are nested, you will
get compile errors and must run the deployment several times until all
the views compile. The "script" side feels that any errors are
unacceptable in a Production environment.
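The nested-view objection can be handled mechanically rather than by hand-ordering: run every object script, collect the failures, and retry only the failed set until it stops shrinking. A rough Python sketch of that loop (the `run_script` callback and the `.sql` file names are hypothetical stand-ins for whatever actually executes a script against the server):

```python
# Retry-until-stable runner: re-applies failed object scripts until no
# further progress is made, which resolves nested-view dependencies
# without hand-ordering the files.
def deploy(scripts, run_script):
    """scripts: list of file names; run_script(name) -> True on success.
    Returns the list of scripts that still fail after the loop settles."""
    pending = list(scripts)
    while pending:
        failed = [s for s in pending if not run_script(s)]
        if len(failed) == len(pending):   # no progress: real errors remain
            return failed
        pending = failed                  # retry only what failed
    return []

# Example: view_b depends on view_a, so view_b fails on the first pass
# and succeeds on the second, once view_a exists.
if __name__ == "__main__":
    created = set()
    deps = {"view_a.sql": set(), "view_b.sql": {"view_a.sql"}}

    def run_script(name):
        if deps[name] <= created:   # all prerequisites already created
            created.add(name)
            return True
        return False

    print(deploy(["view_b.sql", "view_a.sql"], run_script))  # -> []
```

A genuinely broken script never succeeds, so the loop detects the lack of progress and stops instead of spinning forever.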

What is the opinion out there?

Thanks!

"Theodore Feldman" <theodore.feldman@.fmglobal.com> wrote in message
news:26032dec.0401090715.1721f762@.posting.google.com...
> [quoted original post snipped]

One other possible approach is to prepare your Staging database with the
latest tables and other objects, taken from VSS, then use a DB comparison
tool to generate a sync script for deployment to Production. This can save a
lot of time in deployment, and also makes it much harder to 'forget' any
minor or last-minute changes. You would still need a script for data
modifications, of course. I use the Red Gate tools in more or less that way,
and they work well:

http://www.red-gate.com/sql/summary.htm

But there are plenty of other ways to deploy code, so it's probably going to
come down to whatever works best for your team, given the tools and
knowledge you have available.

Simon

|||

"Theodore Feldman" <theodore.feldman@.fmglobal.com> wrote in message
news:26032dec.0401090715.1721f762@.posting.google.com...
> [quoted original post snipped]

The database is still central and will always be so.
The application environment changes every season.
It might have been VB last year but this year it's .NET.
This is history.

It is forever a compromise under current methodologies,
but my opinion is that applications should be written with
more and more of the code outside the application and in
the database environment.

In the end, there will only be the database. The application
environment external to the RDBMS is a redundancy. I would
not be fuelling a redundancy unless you don't have any other
options.

Pete Brown
Falls Creek
Oz

|||

We keep all SQL scripts under source control and build release packages
based on only labeled versions. This provides more control over what gets
promoted. We use a custom .Net program to run the scripts, control the
execution sequence and provide enhanced logging. One can achieve similar
results with OSQL.
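Dan's custom runner can be approximated with a thin wrapper around OSQL. A sketch under assumptions: the release-folder layout, the numeric-prefix naming scheme, and the server/database names are hypothetical; the osql switches used (`-S` server, `-d` database, `-E` trusted connection, `-b` abort on error, `-i` input file) are the standard ones:

```python
import subprocess
from pathlib import Path

def build_command(server, database, script):
    # Standard osql switches: -S server, -d db, -E trusted connection,
    # -b abort batch on error, -i input file.
    return ["osql", "-S", server, "-d", database, "-E", "-b", "-i", str(script)]

def run_release(server, database, release_dir, log, runner=subprocess.run):
    """Execute every .sql file in release order, logging each step.
    A numeric prefix on the file name (010_tables.sql, 020_procs.sql)
    controls the execution sequence."""
    for script in sorted(Path(release_dir).glob("*.sql")):
        log(f"running {script.name}")
        result = runner(build_command(server, database, script))
        if result.returncode != 0:
            log(f"FAILED {script.name}, stopping release")
            return False
    log("release complete")
    return True
```

Passing the runner in as a parameter keeps the sequencing and logging logic testable without a live SQL Server.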

I suggest you treat database implementation script development like you do
code - as an iterative build process. If you find errors, correct the
source controlled scripts, create another release package build and try
again against a fresh staging database until you get it right. This
discipline will ensure that the same stuff that was blessed in your QA/UAT
environment is the same that gets promoted.

Also, check out http://www.abaris.se/abaperls/index.html for a public domain
configuration management toolset. Even if you choose not to use it, it may
give you some ideas for automating the CM process.

--
Hope this helps.

Dan Guzman
SQL Server MVP

"Theodore Feldman" <theodore.feldman@.fmglobal.com> wrote in message
news:26032dec.0401090715.1721f762@.posting.google.com...
> [quoted original post snipped]

|||

theodore.feldman@.fmglobal.com (Theodore Feldman) wrote in message
news:<26032dec.0401090715.1721f762@.posting.google.com>...
> [quoted original post snipped]

We use www.red-gate.com to generate a script that would update
production to the current version of the staging database. This only
applies to the schema (and some static data tables).

The rest, i.e. procs, views, functions, etc., comes direct from
SourceSafe. If there are dependencies, we either use alphabetical
ordering (default) or a custom script that applies inner
functions/views etc. first.
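That "inner objects first" ordering can be automated with a topological sort over the references between scripts. A rough sketch, assuming you can extract which other objects each script mentions (the regex scan here is naive and purely illustrative; real parsing would have to ignore comments and string literals):

```python
import re

def referenced_objects(sql_text, known_names):
    # Naive scan: any known object name appearing in the body counts
    # as a dependency on that object.
    words = set(re.findall(r"\w+", sql_text))
    return {n for n in known_names if n in words}

def apply_order(scripts):
    """scripts: dict of object name -> SQL body.
    Returns names ordered so inner (referenced) objects come first."""
    deps = {name: referenced_objects(body, set(scripts) - {name})
            for name, body in scripts.items()}
    ordered, done = [], set()
    while len(ordered) < len(scripts):
        ready = [n for n in sorted(deps) if n not in done and deps[n] <= done]
        if not ready:
            raise ValueError("circular reference between objects")
        ordered.extend(ready)
        done.update(ready)
    return ordered
```

The `ValueError` branch also doubles as a cheap sanity check: a circular reference between two views is caught before release night rather than during it.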

We have also set up a nightly batch process that does a trial upgrade
from staging to production every night to pick up errors as soon as
possible.

To me it sounds like your DBAs are trying to keep themselves in a job.
Manually creating these scripts can be a time-consuming, laborious and
uninteresting process. We used this process at a previous role and we
changed it.

|||

[posted and mailed, please reply in news]

Theodore Feldman (theodore.feldman@.fmglobal.com) writes:
> [quoted original post snipped]

I just finished another posting on CM issues, see the topic "Script
Question", which gives an outline of how we work in our shop. Here
I will just supplement with your specific issues.

Basically, we go the script way, but we have a tool which is smart enough
to permit regeneration of the script as well as manual tweaks. This gives
the best of both worlds.

A completely manually maintained script is a fragile solution, which is
open to casual mistakes, and risky.

If I did not have a tool at hand, I would probably start from your
.NET alternative, and then find solutions to problems like nested views.

I recall that there is a stored procedure builder that comes with the
SQL Server 2000 Resource Kit. I have never gotten around to evaluating it,
but I know it works from SourceSafe.

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

Tuesday, March 20, 2012

Processing one 'fact partition or measure group' weekly, while others are daily?

Is it possible to do this with 2 different (or 1 intelligent) AS deployment script(s)? Sunday would be the full processing day, while the other days would just handle the smaller fact areas.

You could use Integration Services with XMLA scripts to accomplish this...

Frank
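The XMLA approach Frank mentions boils down to generating a Process command scoped to a single measure group, then letting an SSIS "Analysis Services Execute DDL" task (or a scheduled job) run it. A minimal sketch; the database, cube, and measure-group IDs here are hypothetical placeholders for your own:

```python
def process_command(database_id, cube_id, measure_group_id, process_type):
    # Builds an XMLA Process command scoped to one measure group.
    # process_type is e.g. "ProcessData" for the daily partial job
    # or "ProcessFull" for the Sunday full run.
    return f"""<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>{database_id}</DatabaseID>
    <CubeID>{cube_id}</CubeID>
    <MeasureGroupID>{measure_group_id}</MeasureGroupID>
  </Object>
  <Type>{process_type}</Type>
</Process>"""

# Daily job touches only the small fact area; the Sunday job would
# issue ProcessFull commands for every measure group instead.
daily = process_command("SalesDB", "Sales", "SmallFacts", "ProcessData")
```

Keeping the two schedules as two generated scripts (daily vs. Sunday) is simpler than one script that branches on the day of week, though either works.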

|||

Is this type of thing something that's available in Standard Edition? I tried making a new connection with a type of Integration Services, but I'm not sure where to go from there, as the Object Explorer appears to give no options to make a new package or whatever route I'd need to take to make a new schedule.

Could you provide more info as to how it would be created and then scheduled (or is this a manual thing if not an option provided by SSIS)?

|||

Yes, this is available in the standard edition.

how to create an SSIS package: http://msdn2.microsoft.com/en-us/library/ms141178.aspx

This thread explains a couple of different ways to do this: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1626707&SiteID=1

Frank|||

I will take a look over the article. Thanks.

I also want to ask again, is it possible to process all fact tables but one (not by specifying the name of the facts, but only the name of the one I don't want to process)?

Friday, March 9, 2012

ProcessAdd processing hangs

I'm trying to do an incremental process of a cube that has 3 measure groups. The first measure group is fairly straightforward, run-of-the-mill measures. The other 2 are distinct counts of an ID.

When running the incremental process (ProcessAdd), the status seems to complete, and then everything just hangs. I can't find anything that the server is doing, but it won't return. This happens in BIDS using debug, management studio wizard, and management studio XMLA execution windows.

Sometimes I can get the first measure group to return... Once the 3rd returned. The 2nd has never worked.

Thoughts on what's happening? Does anyone know how to see what's going on in the background? I have a SQL trace going and a trace on the SSAS server, and nothing seems to be going on...

Thanks!
-Doug

I have a similar issue occurring. I have an SSIS package that uses an Analysis Services task to incrementally update the most recent partition in three measure groups. It hangs every time when run under the SQL Agent job engine; however, it processes fine through the SSIS debugger in BIDS.

I think I'm going to try submitting the XMLA statement directly from the job rather than going through SSIS. Did you ever figure out the issue you were seeing?|||

Try running Profiler against Analysis Server and see if there is any activity going on.

Make sure you've got the latest service pack installed. And lastly: if the problem still persists, contact Product Support Services and report it.

Edward Melomed.
--
This posting is provided "AS IS" with no warranties, and confers no rights.
