Our DBA group is debating production release methodology. We always
perfect our deployment package (typically a script) against the
database in a Staging environment, before executing the package
against Production. One side argues that the safest approach is to
create a script containing all schema changes, data changes, stored
procs, functions, etc. Run that script against Staging until it is
error-free. Then, when you run it against Production on Release Night
you know it will also be error-free, since Staging is a copy of
Production. However, any changes to the deployment, such as updates
to procs (and there are always a bunch in the pre-deployment period),
must be manually applied to the script.
The other side wants to take advantage of the .NET environment. We
keep all procs, views, and functions in a Release folder on VSS,
execute a Get Latest Version (Recursive) from Visual Studio, and run
against Staging/Production. Any changes to the procs are
automatically propagated from VSS to VS, and there is no manual
editing of the script. You still need a script for table and data
changes. The "script everything" side of the debate feels this method
is not as reliable, especially since, when views are nested, you will
get compile errors and must re-run the script several times until all
the views compile. The "script" side feels that any errors are
unacceptable in a Production environment.
What is the opinion out there?
Thanks!"Theodore Feldman" <theodore.feldman@.fmglobal.com> wrote in message
news:26032dec.0401090715.1721f762@.posting.google.c om...
> [original post snipped]
One other possible approach is to prepare your Staging database with the
latest tables and other objects, taken from VSS, then use a DB comparison
tool to generate a sync script for deployment to production. This can save a
lot of time in deployment, and also makes it much harder to 'forget' any
minor or last-minute changes. You would still need a script for data
modifications, of course. I use the Red Gate tools in more or less that way,
and they work well:
http://www.red-gate.com/sql/summary.htm
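For the data modifications, what has worked for us is making that script
re-runnable, so it behaves the same on Staging and Production however many
times it is executed. A minimal sketch, with made-up table and column names:

-- Hypothetical, re-runnable data change: insert the lookup row only if
-- it is not already there, so the script is safe to run more than once.
IF NOT EXISTS (SELECT * FROM dbo.OrderStatus WHERE StatusCode = 'BACKORD')
BEGIN
    INSERT INTO dbo.OrderStatus (StatusCode, Description)
    VALUES ('BACKORD', 'On back order')
END
GO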
But there are plenty of other ways to deploy code, so it's probably going to
come down to whatever works best for your team, given the tools and
knowledge you have available.
Simon

"Theodore Feldman" <theodore.feldman@.fmglobal.com> wrote in message
news:26032dec.0401090715.1721f762@.posting.google.com...
> [original post snipped]
The database is still central and will always be so.
The application environment changes every season.
It might have been VB last year, but this year it's .NET.
This is history.
It is forever a compromise under current methodologies,
but my opinion is that applications should be written with
more and more of the code outside the application and in
the database environment.
In the end, there will only be the database. The application
environment external to the RDBMS is a redundancy. I would
not be fuelling a redundancy unless you don't have any other
options.
Pete Brown
Falls Creek
Oz

We keep all SQL scripts under source control and build release packages
based only on labeled versions. This provides more control over what gets
promoted. We use a custom .NET program to run the scripts, control the
execution sequence and provide enhanced logging. One can achieve similar
results with OSQL.
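As a sketch of the kind of control the runner gives us (the table name and
version numbers here are made up, not our actual code), each release script
can guard against being applied twice and record what was promoted:

-- Create the version table on first use.
IF OBJECT_ID('dbo.SchemaVersion') IS NULL
    CREATE TABLE dbo.SchemaVersion (
        Version   varchar(20) NOT NULL PRIMARY KEY,
        AppliedOn datetime    NOT NULL DEFAULT GETDATE()
    )
GO
-- Refuse to re-apply a release that has already been promoted.
IF EXISTS (SELECT * FROM dbo.SchemaVersion WHERE Version = '2.1')
BEGIN
    RAISERROR('Release 2.1 already applied - aborting.', 16, 1)
    SET NOEXEC ON  -- compile but skip everything that follows
END
GO
-- ... schema, proc and data changes for release 2.1 go here ...
INSERT INTO dbo.SchemaVersion (Version) VALUES ('2.1')
GO
SET NOEXEC OFF
GO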
I suggest you treat database implementation script development like you do
code - as an iterative build process. If you find errors, correct the
source-controlled scripts, create another release package build, and try
again against a fresh staging database until you get it right. This
discipline ensures that the same code that was blessed in your QA/UAT
environment is exactly what gets promoted.
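The "fresh staging database" step can be scripted too; something along
these lines (the backup path and logical file names are assumptions)
restores Staging from the latest Production backup before each trial run:

-- Hypothetical names and paths: refresh Staging from the most recent
-- Production full backup before re-testing the release package.
RESTORE DATABASE Staging
FROM DISK = 'D:\Backups\Production_Full.bak'
WITH REPLACE,
     MOVE 'Production_Data' TO 'D:\SQLData\Staging.mdf',
     MOVE 'Production_Log' TO 'D:\SQLData\Staging_log.ldf'
GO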
Also, check out http://www.abaris.se/abaperls/index.html for a public domain
configuration management toolset. Even if you choose not to use it, it may
give you some ideas for automating the CM process.
--
Hope this helps.
Dan Guzman
SQL Server MVP
"Theodore Feldman" <theodore.feldman@.fmglobal.com> wrote in message
news:26032dec.0401090715.1721f762@.posting.google.c om...
> Hi,
> Our DBA group is debating production release methodology. We always
> perfect our deployment package (typically a script) against the
> database in a Staging environment, before executing the package
> against Production. One side argues that the safest approach is to
> create a script containing all schema changes, data changes, stored
> procs, functions, etc. Run that script against Staging until it is
> error-free. Then, when you run it against Production on Release Night
> you know it will also be error-free, since Staging is a copy of
> Production. However, any changes to the deployment, such as updates
> to procs (and there are always a bunch in the pre-deployment period)
> must be manually implemented on the script.
> The other side wants to take advantage of the .NET environment. We
> keep all procs, views, and functions in a Release folder on VSS,
> execute a Get Latest Version (Recursive) from VisualStudio, and run
> against Staging/Production. Any changes to the procs are
> automatically propagated from VSS to VS, and there is no manual
> editing of the script. You still need a script for table and data
> changes. The "script everything" side of the debate feels this method
> is not as reliable, especially since when views are nested, you will
> get compile errors and must execute the run-on several times until all
> the views compile. The "script" side feels that any errors are
> unacceptable in a Production environment.
> What is the opinion out there?
> Thanks!|||theodore.feldman@.fmglobal.com (Theodore Feldman) wrote in message news:<26032dec.0401090715.1721f762@.posting.google.com>...
> [original post snipped]
We use www.red-gate.com to generate a script that would update
production to the current version of the staging database. This only
applies to the schema (and some static data tables).
The rest, i.e. procs, views, functions, etc., come directly from
SourceSafe. If there are dependencies, we either use alphabetical
ordering (the default) or a custom script that applies inner
functions/views etc. first.
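One way to work out that ordering on SQL Server 2000 is to query the
system tables. A rough sketch (sysdepends rows can be missing if objects
were created out of order, so treat this as a starting point only):

-- List views that reference other views, so the inner (referenced)
-- views can be applied first.
SELECT DISTINCT o.name AS dependent_view,
       r.name AS referenced_view
FROM sysdepends d
JOIN sysobjects o ON o.id = d.id
JOIN sysobjects r ON r.id = d.depid
WHERE o.xtype = 'V'
  AND r.xtype = 'V'
ORDER BY r.name, o.name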
We have also set up a nightly batch process that does a trial upgrade
from staging to production, to pick up errors as soon as possible.
To me it sounds like your DBAs are trying to keep themselves in a job.
Manually creating these scripts can be a time-consuming, laborious and
uninteresting process. They used this process at a previous role of mine,
and we changed it.

[posted and mailed, please reply in news]
Theodore Feldman (theodore.feldman@.fmglobal.com) writes:
> [original post snipped]
I just finished another posting on CM issues; see the topic "Script
Question", which gives an outline of how we work in our shop. Here
I will just comment on your specific issues.
Basically, we go the script way, but we have a tool which is smart enough
to permit regeneration of the script as well as manual tweaks. This gives
the best of both worlds.
A completely manually maintained script is a fragile solution, open
to casual mistakes, and risky.
If I did not have a tool at hand, I would probably start from your
.NET alternative, and then find solutions to problems like nested views.
I recall that there is a stored procedure builder that comes with the
SQL Server 2000 Resource Kit. I have never gotten around to evaluating it,
but I know it works from SourceSafe.
--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se
Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp