
DumpSchema / LoadSchema add destDir param

Open CIenthusiast opened this issue 2 years ago • 13 comments

Is your feature request related to a problem? Please describe. We sync the df-files from our Progress databases to a git repo. Since we have a lot of tables / fields, this file got way too large. Some git plugins get really slow processing big files. A single file in a repo also defeats the purpose of some of git's nicer features.

Describe the solution you'd like I would like to use PCT like this: <PCTDumpSchema destDir="${destDir}/${dbName}" dlcHome="${dlcHome}" cpInternal="utf-8" cpStream="utf-8" cpColl="basic" tempDir="${tempDir}">

I expect to find 10 df-files in "${destDir}/${dbName}" if I have 10 tables defined, like: ${destDir}/${dbName}/<tableName>.df. The same goes for loading the schema.
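Until such a destDir parameter exists, the requested layout can be approximated by post-processing the monolithic .df. A minimal Python sketch, assuming each table's definitions start with an `ADD TABLE "<name>"` line (the PSC trailer at the end of the file, if present, would end up in the last table's file and is not handled here):

```python
import re
from pathlib import Path

def split_df(df_path, dest_dir):
    """Split a monolithic .df into one file per table.

    Assumes each table's block starts with an 'ADD TABLE "<name>"'
    line; everything up to the next ADD TABLE (fields, indexes)
    belongs to that table. Lines before the first ADD TABLE
    (e.g. sequences) are discarded in this sketch.
    """
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    current_name, buf, written = None, [], []
    for line in Path(df_path).read_text(encoding="utf-8").splitlines(keepends=True):
        m = re.match(r'ADD TABLE "([^"]+)"', line)
        if m:
            if current_name:
                (dest / f"{current_name}.df").write_text("".join(buf), encoding="utf-8")
                written.append(current_name)
            current_name, buf = m.group(1), []
        buf.append(line)
    if current_name:
        (dest / f"{current_name}.df").write_text("".join(buf), encoding="utf-8")
        written.append(current_name)
    return written
```

This gives one `<tableName>.df` per table, which is enough for the SCM use case described above.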

For PCTIncrementalDump I propose a new destDir parameter as well, which would save the delta per table (only if there is a delta, of course).

Describe alternatives you've considered 1) I tried to use the tables param, but to do so I first needed to run <PCTDumpData tables="_File" destDir="${destDir}" dlcHome="${dlcHome}" cpInternal="utf-8" cpStream="utf-8" cpColl="basic" tempDir="${tempDir}"> and extract a comma-separated list of table names out of the dumped file. This list could then be used in <PCTDumpSchema> again (one call per entry in the table list).
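The extraction step of this workaround can be sketched in Python. This assumes the usual .d dump layout (one record per line, character fields quoted, _File-Name as the first field, a lone "." marking the end of data before the PSC footer) and skips Progress schema tables, whose names start with an underscore:

```python
def tables_from_dump(d_path):
    """Build a comma-separated table list from a PCTDumpData
    dump of _File. Assumes _File-Name is the first (quoted)
    field on each line and that a lone '.' separates the data
    from the PSC footer."""
    names = []
    with open(d_path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line == ".":                      # end-of-data marker
                break
            if line.startswith('"'):
                name = line[1:line.index('"', 1)]
                if not name.startswith("_"):     # skip schema tables
                    names.append(name)
    return ",".join(sorted(names))
```

The returned string could then be passed to the tables attribute of <PCTDumpSchema>.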

Additional context I looked inside your src and ended up in dmpSch.p, which does RUN prodict/dump_df.p (INPUT cTables, INPUT cFile, INPUT SESSION:CPSTREAM). I assume that prodict/dump_df.p is under the control of Progress.

CIenthusiast avatar Apr 23 '23 08:04 CIenthusiast

Sounds useful.

mikefechner avatar Apr 23 '23 09:04 mikefechner

Could be done. How do you plan to manage incremental dumps in your repo (and how will you generate them)?

gquerret avatar Apr 24 '23 05:04 gquerret

@gquerret Not sure if I understand you correctly, but our environment works like this:

Our devs have a local DB based on a df-file provided by the git repository. There's a script which syncs the DB with the git repo like this:

  1. Dump the local DB as a df-file into this git repository. (commit)
  2. Pull the latest df-file from the repo and merge it with our local changes (fix merge conflicts, if any).
  3. Create a temporary, second DB based on the merged df-file.
  4. Create an incremental df between those two local DBs.
  5. Load the incremental df into the dev DB.
  6. Finally, push the changes.

Now the Git repo and the local DB should be the same. It may sound complicated, but we automated this process pretty smoothly (including opening the IDE for commit messages / conflicts), and it has improved our CI/CD capabilities a lot.
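The six steps above can be sketched as a dry-run plan. Note that every command and Ant target name here is a hypothetical placeholder, not a real PCT target; in practice these would be Ant targets wrapping the actual PCT tasks (PCTDumpSchema, PCTCreateDatabase, PCTIncrementalDump, PCTLoadSchema):

```python
def plan_sync(db, tmp_db, merged_df, delta_df):
    """Return the command sequence for the sync workflow described
    above, without executing anything. All target names are
    illustrative placeholders for Ant targets wrapping PCT tasks."""
    return [
        ["ant", "dump-schema", f"-Ddb={db}"],                                # 1. dump local DB, commit
        ["git", "pull"],                                                     # 2. merge repo df-file
        ["ant", "create-temp-db", f"-Ddb={tmp_db}", f"-Ddf={merged_df}"],    # 3. temp DB from merged df
        ["ant", "incremental-dump", f"-Dsource={db}", f"-Dtarget={tmp_db}",
         f"-Ddelta={delta_df}"],                                             # 4. delta between the two DBs
        ["ant", "load-schema", f"-Ddb={db}", f"-Ddf={delta_df}"],            # 5. apply delta to dev DB
        ["git", "push"],                                                     # 6. publish
    ]
```

A wrapper script would run these in order, pausing at step 2 for conflict resolution.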

If you are interested in the CD processes (deploying to customers DB) let me know. Cheers!

CIenthusiast avatar Apr 24 '23 10:04 CIenthusiast

@gquerret I think those are different topics.

Versioning the .df per DB table will simplify things for the SCM side of life:

  • single history per file
  • faster diff view in the fancy graphical Git clients
  • smaller files in pull requests

Your delta.df question is very valid too. We also deploy a full DF with our application framework and create a delta.df on the fly in production.

But for that purpose, we can just concatenate all the .df's into a single .df.
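That concatenation step is trivial to script. A minimal sketch, sorting by file name for a stable result (whether the PSC trailer needs to be re-appended for loading depends on your codepage setup and is not handled here):

```python
from pathlib import Path

def concat_dfs(src_dir, out_file):
    """Concatenate per-table .df files back into a single full .df
    for deployment. Files are sorted by name so repeated runs
    produce identical output."""
    parts = sorted(Path(src_dir).glob("*.df"))
    text = "".join(p.read_text(encoding="utf-8") for p in parts)
    Path(out_file).write_text(text, encoding="utf-8")
    return [p.name for p in parts]
```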

mikefechner avatar Apr 24 '23 10:04 mikefechner

@mikefechner @CIenthusiast I understand the need (or the preference) for a single DF per table. If you also store the incremental DF in the repo, would you want to generate them per table or per database?

gquerret avatar Apr 24 '23 11:04 gquerret

you also store the incremental DF in the repo

I never do.

mikefechner avatar Apr 24 '23 11:04 mikefechner

I never do.

I think we already had this discussion, but this means that you have to handle field renames or adds / deletes during deployment.

gquerret avatar Apr 24 '23 11:04 gquerret

field renames? Better start with good names from the start ;)

mikefechner avatar Apr 24 '23 11:04 mikefechner

@mikefechner @CIenthusiast I understand the need (or the preference) for a single DF per table. If you also store the incremental DF in the repo, would you want to generate them per table or per database?

I think both ways should be possible. There might be cases where you want a single file as well (for CD purposes, maybe).

CIenthusiast avatar Apr 24 '23 12:04 CIenthusiast

Incremental dump per table would require significant work.

gquerret avatar Apr 24 '23 13:04 gquerret

This would be a bonus really. What matters is dump / load per table.

CIenthusiast avatar Apr 24 '23 17:04 CIenthusiast

Any news / planned release on this?

CIenthusiast avatar Jul 19 '23 12:07 CIenthusiast

Unfortunately, I had no chance to work on that. You can open a PR if you want.

gquerret avatar Jul 31 '23 06:07 gquerret

Closed as won't fix.

gquerret avatar Oct 30 '25 08:10 gquerret