We work with two databases: a working database and a publication database.
In our working database, we consolidate objects that are already used in (published) models.
What is the easiest and least error-prone way to carry the consolidation over to our publication database? Do you have any experience with this?
Option A:
We republish all models that use the "old" objects (i.e., the objects that are now being replaced by a master) in our publication database.
Problem: very high workload.
Option B:
We perform the consolidation in both databases (working and publication).
Problem: we then have to match objects by their exact GUIDs. Since GUIDs are not shown in the standard consolidation overviews, this is also very labor-intensive and error-prone.
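To illustrate the GUID-matching step in Option B, here is a minimal sketch in Python. It assumes you can export the object inventory of each database as a CSV with `name`, `type`, and `guid` columns, and that the masters chosen during consolidation are known by GUID; the file format, column names, and GUIDs are hypothetical, not part of any tool's actual export.

```python
import csv
from collections import defaultdict

def load_objects(path):
    """Load an exported object list (hypothetical CSV with name, type, guid columns)."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def consolidation_map(objects, master_guids):
    """Map each duplicate's GUID to the GUID of its designated master.

    Objects are grouped by (name, type); within each group, the object
    whose GUID appears in master_guids is the consolidation target.
    """
    groups = defaultdict(list)
    for obj in objects:
        groups[(obj["name"], obj["type"])].append(obj)

    mapping = {}
    for (name, typ), members in groups.items():
        masters = [o for o in members if o["guid"] in master_guids]
        if len(masters) != 1:
            continue  # no master or an ambiguous one: leave for manual review
        master = masters[0]
        for o in members:
            if o["guid"] != master["guid"]:
                mapping[o["guid"]] = master["guid"]
    return mapping
```

Such a mapping could then serve as a checklist when repeating the consolidation in the second database, so that each merge is verified against exact GUIDs rather than names read off a screen.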
Do you have this problem too? How do you solve it?
I have submitted a suggestion for improvement: Consolidate objects across databases | Share your ideas
Please vote for it if you have the same problem.
Alexander Cherednichenko:
Hi,
IMHO, it’s worth addressing the root cause (why so many objects need consolidation) rather than focusing only on the symptom (the high workload caused by consolidation activities).
First, I would recommend analyzing what drives the creation of duplicates (or near-duplicates) in the working database and then putting a few straightforward preventive measures in place:
You may not be able to fully prevent the creation of new definitions, but you can ensure you detect them quickly. With the right structure and access rules, it becomes much easier to identify where/when new objects are created and to react early.
This typically includes elements such as:
3. Consistent enablement of modelers
Regular QA feedback loops, lessons learned, and documented best practices help significantly reduce “new duplicate creation” over time.
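The early-detection idea above can be sketched as a simple snapshot diff, assuming the object inventory is exported periodically as a dict keyed by GUID; the GUIDs and metadata fields here are purely illustrative.

```python
def new_objects(previous, current):
    """Return objects present in the current snapshot but not in the previous one.

    Each snapshot maps GUID -> object metadata (e.g. name, type, creator).
    Comparing by GUID, not by name, is what makes the diff reliable.
    """
    return {guid: meta for guid, meta in current.items() if guid not in previous}

# Example with hypothetical GUIDs:
prev = {"g1": {"name": "Customer"}, "g2": {"name": "Order"}}
curr = {"g1": {"name": "Customer"}, "g2": {"name": "Order"},
        "g3": {"name": "Customer (copy)"}}
created = new_objects(prev, curr)
# created == {"g3": {"name": "Customer (copy)"}}
```

Run against regular exports, a diff like this flags likely duplicates (e.g. near-identical names) shortly after creation, while consolidating them is still cheap.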
With these measures in place, the consolidation effort should decrease materially, and the problem becomes far more manageable.
Regards,