I think this is an excellent topic. I work in a regulated industry, and we need to move our systems through many environments before they're production ready.
Basically I've found no good way to automate this. I typically have a control script which calls other scripts; the other scripts are generated from PL/Dev either as a direct dump (for code and triggers) or as a Compare (for objects). The steps are (a rough sketch of the control-script side follows the list):
1] Export all code (packages + triggers, etc)
2] Diff between environments for objects (tables, etc)
3] Perform data dump of ref data
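To make the control-script idea concrete, here's a minimal sketch in Python of what mine boils down to: run the pieces that PL/Dev exported, in a fixed order, against one target. The connect string, file names, and paths are placeholders, not what I actually use.

```python
# Sketch of a control script that applies the PL/Dev exports in order.
# Assumes each .sql file starts with WHENEVER SQLERROR EXIT FAILURE so
# errors show up in the SQL*Plus return code.
import subprocess
import sys

TARGET = "user/password@TESTDB"      # hypothetical connect string
SCRIPTS = [
    "01_objects_upgrade.sql",        # output of the object compare (step 2)
    "02_code.sql",                   # packages + triggers export (step 1)
    "03_ref_data.sql",               # reference data dump (step 3)
]

def run_script(script: str) -> None:
    """Feed one SQL file to SQL*Plus and stop on the first failure."""
    result = subprocess.run(
        ["sqlplus", "-S", TARGET, f"@{script}"],
        capture_output=True, text=True
    )
    if result.returncode != 0:
        print(f"{script} failed:\n{result.stdout}\n{result.stderr}")
        sys.exit(1)
    print(f"{script} applied")

if __name__ == "__main__":
    for script in SCRIPTS:
        run_script(script)
```

The exports themselves still come out of PL/Dev by hand, which is exactly the problem described below.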
This is still a very manual process:
first, since you can't save a group of objects as an export set, you end up having to diff the results of [1] against the last build just to make sure you got everything (see the sketch after this list);
second, you have to visually check all sequences, since PL/Dev treats sequences whose current numbers differ as different objects;
third, you can't drive the tools from the command line (or save the tool parameters), so the data dumps are done manually each time (selecting the correct tables).
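For the first two points, the check I do by hand could be scripted: compare the current export directory against the last build, masking the sequence START WITH value first so sequences only flag as changed when their definition (not their current number) differs. Directory names and the file layout here are placeholders.

```python
# Diff the current export against the previous build, ignoring sequence counters.
import difflib
import re
from pathlib import Path

SEQ_NUMBER = re.compile(r"start with \d+", re.IGNORECASE)

def normalised(path: Path) -> list[str]:
    """Read a script, masking the sequence counter so it doesn't cause noise."""
    return [SEQ_NUMBER.sub("start with 1", line)
            for line in path.read_text().splitlines()]

def diff_exports(last_build: Path, current: Path) -> None:
    for new_file in sorted(current.glob("*.sql")):
        old_file = last_build / new_file.name
        if not old_file.exists():
            print(f"NEW: {new_file.name}")
            continue
        delta = list(difflib.unified_diff(
            normalised(old_file), normalised(new_file), lineterm=""))
        if delta:
            print(f"CHANGED: {new_file.name} ({len(delta)} diff lines)")

if __name__ == "__main__":
    diff_exports(Path("exports/build_41"), Path("exports/build_42"))
```

That removes the eyeballing, but it's still working around the tool rather than with it.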
The common thread here seems to be the ability to save the parameters passed to Export Tables, Compare User Objects, and Export User Objects in some kind of set so they can be re-run consistently. Seems like an expansion of the Project concept.
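Until something like that exists in the tool, the closest I can get is keeping my own record of the parameters each run is supposed to use, so at least the manual runs stay consistent and reviewable. The file format and table names below are made up purely for illustration; PL/Dev doesn't read anything like this today.

```python
# Keep a versioned record of the parameters for each export/compare run.
import json
from pathlib import Path

PARAM_SET = {
    "export_tables": {"tables": ["REF_COUNTRY", "REF_STATUS"], "where": "1=1"},
    "compare_user_objects": {"object_types": ["TABLE", "SEQUENCE", "INDEX"]},
    "export_user_objects": {"include_storage": False},
}

def save(path: str = "release_params.json") -> None:
    Path(path).write_text(json.dumps(PARAM_SET, indent=2))

def load(path: str = "release_params.json") -> dict:
    return json.loads(Path(path).read_text())

if __name__ == "__main__":
    save()
    print(json.dumps(load(), indent=2))
```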
In terms of source control, I have separate directories for Code, Data, Images, Objects (initial creation), Tests, Upgrade Scripts.
I use tagging frequently to record the code state when moving between environments.
The Upgrade Scripts directory contains the files required to move between environments (a single file of code, a single file of data, a single file of object upgrades). I don't feel this is very efficient, but it's the best I can devise so far.
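Assembling those single files is another place a script helps. A sketch, assuming the directory layout above and that plain concatenation in name order is an acceptable run order (which it may not be for dependent objects):

```python
# Build one upgrade script per area by concatenating the exported pieces.
from pathlib import Path

def build_upgrade_file(source_dir: str, target_file: str) -> None:
    """Concatenate every .sql file in source_dir into one upgrade script."""
    parts = []
    for script in sorted(Path(source_dir).glob("*.sql")):
        parts.append(f"-- source: {script.name}")
        parts.append(script.read_text())
    target = Path(target_file)
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text("\n".join(parts))

if __name__ == "__main__":
    build_upgrade_file("Code", "Upgrade Scripts/upgrade_code.sql")
    build_upgrade_file("Data", "Upgrade Scripts/upgrade_data.sql")
```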
~ ~ Dave