CMDBuild Forum

Non-Unique issue during patch 35 on 3.2.0 upgrade

I upgraded from 3.1.1 to 3.2.0 and received a large number of errors while attempting to apply patch 35.
This was caused by duplicate (IdObj2, DefaultFor) pairs among Status = 'A' records in the Map_FilterRole table.

The non-unique entries were added as a result of a bug in previous versions.

Can any advice be offered on the best way to resolve this issue?
I have manually set the Status field to N for all the non-unique entries so that I could complete the upgrade. However, I now have an issue where the Default Filters on some of my Classes are incorrect and cannot be changed: it refuses to update the table and reports no error.
Can I remove the table entries completely to clear all Default Filters, or does this table do more than just handle the Default Filters?

Here is the error I received:
org.cmdbuild.dao.DaoException: error processing patch = Patch{version=3.2.0-35_improved_costraint_for_default_filter, category=core, applied=false}, caused by: org.cmdbuild.dao.DaoException: error processing patch = core-3.2.0-35_improved_costraint_for_default_filter (improved constraint for default filter), caused by: org.springframework.dao.DuplicateKeyException: StatementCallback; SQL [ SET SESSION cmdbuild.ignore_tenant_policies = 'true'; SET SESSION cmdbuild.user_tenants = '{}'; DO $$ BEGIN RAISE NOTICE 'apply cmdbuild patch %', 'core-3.2.0-35_improved_costraint_for_default_filter'; END $$ LANGUAGE PLPGSQL; -- improved constraint for default filter SELECT _cm3_attribute_index_unique_create('"Map_FilterRole"','IdObj2','DefaultFor'); DO $$ BEGIN RAISE NOTICE 'applied cmdbuild patch %', 'core-3.2.0-35_improved_costraint_for_default_filter'; END $$ LANGUAGE PLPGSQL; ]; ERROR: could not create unique index "_cm3_Map_FilterRole_DefaultFor_IdObj2" Detail: Key ("IdObj2", "DefaultFor")=(24, 16798) is duplicated. Where: SQL statement "CREATE UNIQUE INDEX "_cm3_Map_FilterRole_DefaultFor_IdObj2" ON "Map_FilterRole" ("IdObj2","DefaultFor") WHERE "Status" = 'A'" PL/pgSQL function _cm3_attribute_index_create(regclass,boolean,character varying[]) line 29 at EXECUTE; nested exception is org.postgresql.util.PSQLException: ERROR: could not create unique index "_cm3_Map_FilterRole_DefaultFor_IdObj2" Detail: Key ("IdObj2", "DefaultFor")=(24, 16798) is duplicated. Where: SQL statement "CREATE UNIQUE INDEX "_cm3_Map_FilterRole_DefaultFor_IdObj2" ON "Map_FilterRole" ("IdObj2","DefaultFor") WHERE "Status" = 'A'" PL/pgSQL function _cm3_attribute_index_create(regclass,boolean,character varying[]) line 29 at EXECUTE, caused by: org.postgresql.util.PSQLException: ERROR: could not create unique index "_cm3_Map_FilterRole_DefaultFor_IdObj2" Detail: Key ("IdObj2", "DefaultFor")=(24, 16798) is duplicated.
Where: SQL statement "CREATE UNIQUE INDEX "_cm3_Map_FilterRole_DefaultFor_IdObj2" ON "Map_FilterRole" ("IdObj2","DefaultFor") WHERE "Status" = 'A'" PL/pgSQL function _cm3_attribute_index_create(regclass,boolean,character varying[]) line 29 at EXECUTE

Hello friend, I have the same error. It happens to me after deploying openMAINT in Tomcat. Did you find a solution?

Here is an example of one of them.

To find the details needed to mark entries as inactive, filter by the values reported in the error during the upgrade:
SELECT * FROM public."Map_FilterRole"
WHERE "Map_FilterRole"."IdObj2" = 16 AND "Map_FilterRole"."DefaultFor" = 17687
AND "Map_FilterRole"."Status" = 'A';

Then once you have worked out which ones are not valid, mark them as inactive:
UPDATE public."Map_FilterRole"
SET "Status" = 'N'
WHERE "Map_FilterRole"."IdObj2" = 16 AND "Map_FilterRole"."DefaultFor" = 17687
AND "Map_FilterRole"."IdObj1" != 53983;

IdObj1 will be the unique value from the first query that you know is valid. I believe it's fairly easy to spot the valid ones, as they have a lot more data on the row.
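To list every duplicated pair in one go, rather than re-running the upgrade once per error message, a grouping query along these lines should work (a sketch, not tested against every CMDBuild version):

SELECT "IdObj2", "DefaultFor", count(*) AS copies
FROM public."Map_FilterRole"
WHERE "Status" = 'A' AND "DefaultFor" IS NOT NULL
GROUP BY "IdObj2", "DefaultFor"
HAVING count(*) > 1;

The "DefaultFor" IS NOT NULL condition mirrors the failing partial unique index: only active rows with matching non-NULL pairs conflict, since NULLs never compare equal in a unique index.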

I did manage to resolve it myself. There were some stale records in the tables indicated that were causing a key violation. From the look of it (and my recollection), there was an issue a while back that allowed you to create the same filter multiple times.
It was a pain to work out which were which, but I eventually figured out which records were not needed and manually deleted them from the database via pgAdmin. There were a lot of them, though, so you have to keep repeating the process.
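For anyone digging through this later: instead of deleting rows one at a time, the active duplicates can in principle be deactivated in bulk, keeping one row per (IdObj2, DefaultFor) pair. This is an untested sketch that assumes, as is usual for CMDBuild tables, that Map_FilterRole has an "Id" primary key and that the highest Id is the row worth keeping. Take a database backup first and check the affected rows with a SELECT before running the UPDATE:

UPDATE public."Map_FilterRole"
SET "Status" = 'N'
WHERE "Status" = 'A'
AND "DefaultFor" IS NOT NULL
AND "Id" NOT IN (
    SELECT max("Id")
    FROM public."Map_FilterRole"
    WHERE "Status" = 'A' AND "DefaultFor" IS NOT NULL
    GROUP BY "IdObj2", "DefaultFor"
);

The "DefaultFor" IS NOT NULL guard matters: rows with a NULL DefaultFor are ordinary filter/role associations that never violate the new index, so they must not be touched.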