Bulk insert not processed because of a duplicate (unique index composed of two columns) #9106
malalecherocks
started this conversation in General
Hi, my table `contacts` has the columns:

- `id` (autogenerated uuid)
- `team_id` (a foreign key identifying the team that owns the contact)
- `name`
- `phone`

I have added a UNIQUE INDEX on the columns `(team_id, phone)`, because each team will have several contacts, but each contact must be unique within its team.
My problem comes when, for example, one contact already exists and I then try to import several contacts at once: the insert fails with a duplicate key error because one of the rows has the same (team_id, phone), and none of the remaining rows get inserted.
According to the docs that is expected: if any row in the insert fails, the whole statement is rolled back.
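The failure mode above can be reproduced with a minimal sketch. SQLite is used here only as a convenient stand-in, since its composite UNIQUE constraint and its statement-level failure behave like Postgres for this case; the table and data are illustrative:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE contacts (
        id      INTEGER PRIMARY KEY,   -- stand-in for the autogenerated uuid
        team_id INTEGER NOT NULL,
        name    TEXT,
        phone   TEXT NOT NULL,
        UNIQUE (team_id, phone)        -- the composite unique index
    )
""")
# One contact already exists for team 1.
con.execute("INSERT INTO contacts (team_id, name, phone) VALUES (1, 'Alice', '555-0100')")
con.commit()

try:
    # Bulk import: the second row collides on (team_id, phone),
    # so the batch raises and nothing from it survives the rollback.
    con.executemany(
        "INSERT INTO contacts (team_id, name, phone) VALUES (?, ?, ?)",
        [(1, 'Bob', '555-0101'), (1, 'Alice again', '555-0100')],
    )
except sqlite3.IntegrityError as e:
    con.rollback()
    print("bulk insert failed:", e)

print(con.execute("SELECT COUNT(*) FROM contacts").fetchone()[0])  # still 1
```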
My question is: is there any workaround? I've been reading about upsert, but since I'm using a composite UNIQUE INDEX, is that feasible? I did some tests and couldn't make it work.
My last resort would be creating an RPC and doing the inserts one by one, but I'd rather send the whole array of rows in one call and have only the non-duplicate rows inserted.
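For what it's worth, a conflict target on a composite unique index is supported by `INSERT ... ON CONFLICT (team_id, phone) DO NOTHING`, which skips the colliding rows and inserts the rest in a single statement. A sketch under the same stand-in assumptions as above (SQLite shares this upsert syntax with Postgres; names and data are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE contacts (
        id      INTEGER PRIMARY KEY,   -- stand-in for the autogenerated uuid
        team_id INTEGER NOT NULL,
        name    TEXT,
        phone   TEXT NOT NULL,
        UNIQUE (team_id, phone)        -- the composite unique index
    )
""")
con.execute("INSERT INTO contacts (team_id, name, phone) VALUES (1, 'Alice', '555-0100')")

rows = [
    (1, 'Alice dup', '555-0100'),  # collides on (team_id, phone): skipped
    (1, 'Bob',       '555-0101'),  # new: inserted
    (2, 'Alice',     '555-0100'),  # same phone, different team: inserted
]
# The conflict target names both columns of the composite unique index.
con.executemany(
    "INSERT INTO contacts (team_id, name, phone) VALUES (?, ?, ?) "
    "ON CONFLICT (team_id, phone) DO NOTHING",
    rows,
)
con.commit()

print(con.execute("SELECT COUNT(*) FROM contacts").fetchone()[0])  # 3
```

If the client library in use exposes an upsert with a conflict target (for example, supabase-js's `onConflict` and `ignoreDuplicates` options), the same behavior may be reachable without a custom RPC, though that depends on the stack being used here.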
Thanks