This topic is locked

Best practice for local vs. production database

2/18/2010 9:39:16 AM
PHPRunner General questions
mcebula author

Hi all, is there a best practice for testing apps locally vs. production with regard to the database?
Do you use a single production database that you access remotely when testing changes locally (using WAMP or XAMPP for local PHP/Apache)? Or do you keep a separate database (either remote or local) and switch to it for local testing? Maybe I missed it, but I didn't see an easy way to change the database connection information after a project was initially started.
The first method is easy: have one database and just FTP the PHP files to production when ready. But I worry a little about doing development against the production database.
I'm fine using a single database if that's the norm, but I just wanted to check in and compare with what others who have more experience are doing.

A
Aleix 2/18/2010





Hi,
We are working on the same topic.
We like to work this way:
1. A database in production mode.

2. A database in local development/testing mode. That way, if we do anything wrong, it does not affect production.
We do:

1. With phpMyAdmin, export the production database (structure and data).

2. Create the database on the local or development/testing server.

3. Import the structure and data into the local or development/testing database.

4. Open the project and save it as the next version.

5. Connect the project to the local or development/testing database.
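A rough command-line equivalent of steps 1-3 above uses mysqldump instead of phpMyAdmin. This is a sketch only: the host, user and database names are placeholders for your own setup.

```shell
# Sketch only: host, user and database names are placeholders.
PROD_HOST="prod.example.com"
DB_USER="appuser"
DB_NAME="appdb"
DEV_DB="${DB_NAME}_dev"

copy_prod_to_dev() {
    # Step 1: dump structure and data from the production database
    # (-p prompts for the password; --routines includes stored procedures).
    mysqldump -h "$PROD_HOST" -u "$DB_USER" -p --routines "$DB_NAME" > "$DB_NAME.sql"

    # Steps 2-3: create the local database and import the dump into it.
    mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS $DEV_DB"
    mysql -u root -p "$DEV_DB" < "$DB_NAME.sql"
}
```

Running the dump this way also gives you a plain .sql file you can keep as a snapshot of the production structure at that point in time.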
It works fine, but it has one big problem:
the list pages come up blank, while the other pages and the reports are OK.
I do not know why the list pages are blank.
Is there another way to work?
Thanks

S
Sandi Jerome 2/18/2010

I use Navicat and have one copy of the database on my localhost and another copy on the server, each with its own Navicat connection. PHPRunner is connected to localhost. If I make changes to localhost, I use Navicat to drag the new tables/views to the server.

Admin 2/18/2010

The best approach is to develop and test everything locally.
Once the development phase is over, you need to transfer the tables and PHP files to the production server. To transfer the tables you can use the method Sandi mentioned, or use any tool that can diff the development and production databases and build a script to run on the server.
After the files are FTP'd to the server, you need to update the include/dbcommon.php file with the production MySQL server address, username and password. In the next version we'll add a way to automate this last step.
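For illustration, the connection settings in include/dbcommon.php look roughly like the following. The exact variable names vary between PHPRunner versions, and the values here are placeholders, so treat this as a sketch rather than the literal file contents.

```php
<?php
// Illustrative only: variable names vary between PHPRunner versions,
// and the host/user/password values here are placeholders.
$host = "localhost";      // change to the production MySQL host
$user = "root";           // change to the production username
$pwd  = "";               // change to the production password
$port = "";
$sys_dbname = "appdb";    // change if the database name differs
?>
```

Because this one file holds the connection details, editing it on the server is enough to point an uploaded application at the production database without regenerating anything.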

E
electromotive 2/18/2010

Surprises during production upgrades are very embarrassing, and too many will lead to a "you're fired"!
Not sure if it's best practice, but here's what I do to minimise surprises.
There are 3-4 versions of the database.

  1. The production DB running on a headless linux/apache (LAMP) server, secure, UPS, etc.
  2. A pre-release DB version with the pending structural changes, also running on headless linux/apache (LAMP), but a low spec machine, just on the LAN.
  3. The current development DB version, running locally using generic XAMPP environment.
  4. An optional DB: any radical experimental DB we need, usually a subset, also local, with a different name. Thrown away after tests.
    The first three all have the same DB name, although the structure can vary, and the data is certainly out of step.

    The pre-release and current development versions get refreshed with real data periodically. Maybe once a month, but are always lagging.

    The production DB is journalled and backed up daily, and locked down.
    To provide some stability, application updates are released to production no more than once or twice a month. All demos, testing and training are done on the pre-release DB. Once the pre-release is ready to go, the production DB is fully backed up, the structure changes are implemented (with SQL scripts as necessary to move things around or initialize new fields), then the PHP is FTP'd, and the whole thing is fully tested. This can mean the production DB is off-line for an hour during the upgrade, maybe slightly longer if we are also updating the LAMP stack. It can be regressed in about 10 minutes.
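The upgrade sequence just described can be sketched as a script. Everything below is a placeholder: the hosts, paths and file names are invented, and lftp stands in for whichever FTP client is actually used.

```shell
# Sketch of the upgrade runbook above; all hosts, paths and names are placeholders.
upgrade_production() {
    # 1. Full backup of the production DB first, so the upgrade can be regressed.
    mysqldump -h prod.example.com -u appuser -p --routines appdb > "backup_$(date +%Y%m%d).sql"

    # 2. Apply the pre-tested structure changes and data-migration scripts.
    mysql -h prod.example.com -u appuser -p appdb < migrate_structure.sql

    # 3. Upload the regenerated PHP files (lftp mirrors a local dir to the server).
    lftp -e "mirror -R ./output /var/www/app; quit" -u appuser prod.example.com

    # 4. Smoke-test; to regress, restore the backup taken in step 1.
}
```

Keeping the migration in a script like this is what makes the 10-minute regression window possible: the same backup file from step 1 is the rollback path.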
    FTP is used to refresh the pre-release and production. Basically, I try to have the pre-release system look as much like production as possible so that any migration issues are dealt with and the scripts are tested. For example, table names are case sensitive in the Linux environment but not in Windows. There are likely other subtle differences between the XAMPP setup and the LAMP servers: PHP, MySQL and the other libraries are at slightly different revisions, with production lagging at the most stable version. No need to keep development lagging though, unless it's a major version. Pre-release testing also means we can try the various workstations and browsers in use and get some idea of what performance is like on the actual network. The pre-release can be rebuilt to test the migration.
    If I'm likely to come unstuck in a production upgrade, it's more likely to be network routing, access controls or something like that. So I try not to be too ambitious and change too many things at once.
    To evaluate something like PHPR 5.2, I start a fork on the development machine but use the same DB. Once 5.2 is ready, that fork will become primary and the 5.1 fork is just kept as a backup. Once reasonable code is being generated, the new fork also moves to the pre-release machine for wider testing and evaluation by the group.

A
Aleix 2/19/2010





Hi,
Great ideas!
I am looking at the Navicat website and it looks fine (I have not tried it yet).
The pre-release DB idea is very good, because you can test in a very similar environment. I will do it.
Mike, if you want to connect to another database you can do it in PHPRunner: on the tables screen, press the Back button and then you can change the server or the database.
One question, Electro Rick: why do you use the same DB name in all three environments? We use:
DBname_customerVersion_developVersion. This way we can easily see which is the latest development version.
Another question: while we are doing development, the users keep changing the data in the production database. We handle it this way:

1. Production DB: export the data (with phpMyAdmin).

2. Development DB: import the data (with phpMyAdmin).

3. Test.

4. Development DB: export the data and structure (with phpMyAdmin for now; we will try the tools you mention).

5. Production DB: import the data and structure.
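The data-vs-structure split in the steps above maps directly onto mysqldump options. A sketch, with placeholder hosts, users and database names:

```shell
# Sketch only: hosts, users and database names are placeholders.
refresh_dev_data() {
    # Steps 1-2: copy *data only* from production into development.
    mysqldump -h prod.example.com -u appuser -p --no-create-info appdb > data.sql
    mysql -h localhost -u appuser -p appdb_dev < data.sql
}

release_structure() {
    # Steps 4-5: copy *structure only* from development to production
    # (--no-data dumps the table definitions without any rows).
    mysqldump -h localhost -u appuser -p --no-data appdb_dev > structure.sql
    mysql -h prod.example.com -u appuser -p appdb < structure.sql
}
```

One caveat: importing a full structure dump over a live database recreates the tables and destroys their data, so for in-place production upgrades incremental ALTER scripts (as discussed elsewhere in this thread) are the safer route.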
Do you have another way?
Thanks

E
electromotive 3/15/2010






Hiya Aleix
I think there is no single optimal way to manage production upgrades; it depends very much on the nature of the application, DB activity, tools, people involved, etc.
To respond to your first question: I use the same DB name because, after testing on pre-release, I don't want to rebuild the application (or make any more changes).
The way I keep track of the DB version is that I have a table in the DB called 'DB_Version'. This is a change log for the DB structure: it notes what structure changes were made and also identifies which builds of the software each DB version is compatible with. So any time the pre-release DB structure gets updated, a new entry is made in this table. I create each new record in the development version, but there is only an entry in the pre-release and production tables for versions that make it through. I don't bother propagating all the interim development DB changes; I stack or summarize those.
I also document the "build" of the PHPR-generated software by adding a text field at the top of my logon screen, though it could be placed in headers, footers, the menu, etc., any place that can be readily seen by users or testers. Each time I prepare a candidate version of the software, I update the version number (major/minor) and update DB_Version with the compatibility requirements. Sounds more complicated than it is. It doesn't take any time at all and enforces some discipline.
Some more detail about this. When making changes to the DB such as adding new fields or tables, it's possible that older versions of the software will still work with these changes. So the DB can often be more advanced than the software. However, as soon as I update the software, older versions of the database will not work. I document these dependencies in the DB_Version records.
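As an illustration, such a change-log table could be created like this. The column set here is invented for the example (the thread only names the table), and the connection details are placeholders; the real columns are whatever your team finds useful.

```shell
# Sketch only: connection details are placeholders and the columns are
# invented for the example.
create_db_version_table() {
    mysql -h localhost -u appuser -p appdb <<'SQL'
CREATE TABLE DB_Version (
    version       VARCHAR(20) NOT NULL PRIMARY KEY,  -- e.g. '2.4'
    applied_on    DATE        NOT NULL,
    changes       TEXT,          -- what structure changes were made
    min_sw_build  VARCHAR(20),   -- oldest software build compatible with this schema
    notes         TEXT
);
SQL
}
```

Because the table lives inside the database itself, it travels with every dump and restore, so any copy of the DB carries its own version history.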
There are many possible reasons why we don't want to regenerate the code after testing. Maybe some developers modify (patch) the generated code, or they don't want to risk that the version of the tool (PHPR) has changed since testing (the pre-release can lag behind development by weeks or longer), or something else unexpected has changed, etc. Yes, it's possible to update the DB parameters without regenerating the code by patching a file, but I chose not to do this for long-term maintenance compatibility reasons.
Now to the second question. The production DB contains the only copy of "live" data. In my case the production DB is quite large: hundreds of MB, 70+ tables. Moving the whole thing around, even over a fast network, takes a long time. The only time we "move the whole thing" is for backup purposes. Now, it's unlikely that the structure of the production DB is the same as the development DB, so to refresh the development DB from live data some scripts are usually required to manipulate the data.
To periodically refresh my development and pre-release DBs (say once a month), I create a temporary copy of the production DB on the development machine, then create/use the scripts to alter/update the structure as required. I save these scripts for later. Once the scripts are ready, I can apply them to the DB and the changes to the structure can be made in just minutes, much faster than copying the whole thing.
Once I have the scripts, it's possible for me to make an on-server backup copy, update the structure of the production DB and upload the new software via FTP in less than 10 minutes; it's all pretested and ready to go. I do a quick check to make sure that I didn't botch the update, but I don't need to test everything in detail before going live again.
If I copied the live data to the development machine, made the changes, and then copied it back, that whole process would take me hours, not including testing. Full testing of my application takes about a week, to make sure all the client-side and server-side validations work, and the lookups, links, search, and page formatting/labels are correct, etc.
If you had a much simpler system, and you didn't mind the application being offline for a few hours, you could certainly use the copy-all method to update and test. That's what I used before the system went live in the first place.
On the other hand, for some applications even 10 minutes off-line is too long. With PHP/HTML it's possible to make changes to a live system without taking it off-line, but the techniques of hot updates are beyond the scope of this thread.