We want to prevent a value from being added to a field if that value already exists in that field.
i.e. we want to replicate the way AppSheet restricts duplicate values in a "key" field.
But we use a different field as our key.
I tried making this other field also a "key", but after saving it was deselected, so it appears a table can only have one key.
I know IF and COUNT should be able to be used, but I am not sure how…
Thanks!
Hi @Tom_Graham,
If I have correctly understood your requirement, could you try the following expression in your SKU field's Valid_If expression:
NOT(IN([_THIS], SELECT(Inventory Data[SKU], NOT(IN([Custom Label], LIST([_THISROW].[Custom Label]))))))
The above expression assumes that [Custom Label] is the key field and [SKU] is the non-key field that you do not wish to be duplicated.
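For anyone trying to follow how this expression evaluates, here is a rough Python model of its logic (the table contents and column values are hypothetical; only the column names come from the expression above):

```python
# Rough Python model of the AppSheet pattern:
#   NOT(IN([_THIS], SELECT(Inventory Data[SKU],
#       NOT(IN([Custom Label], LIST([_THISROW].[Custom Label]))))))
# "Inventory Data" is modeled as a list of dicts with hypothetical rows.

inventory_data = [
    {"Custom Label": "A-001", "SKU": "SKU-100"},
    {"Custom Label": "A-002", "SKU": "SKU-200"},
]

def sku_is_valid(this_sku, this_custom_label):
    """Valid only if this SKU does not appear on any *other* row.

    The inner SELECT collects the SKUs of every row whose key
    ([Custom Label]) differs from the current row's key; excluding
    the row being edited means re-saving a record does not flag its
    own SKU as a duplicate.
    """
    other_skus = [row["SKU"] for row in inventory_data
                  if row["Custom Label"] != this_custom_label]
    return this_sku not in other_skus

# Editing row A-001 and keeping its own SKU stays valid:
print(sku_is_valid("SKU-100", "A-001"))  # True
# Giving row A-002 a SKU already used by A-001 is rejected:
print(sku_is_valid("SKU-100", "A-002"))  # False
```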
Hi @Tom_Graham,
Please explore whether the approach described in the section "Preventing Duplicate Field Values" in the following article helps you:
https://help.appsheet.com/en/articles/961274-list-expressions-and-aggregates
thanks @Suvrutt_Gurjar
Is it possible to achieve this without creating and using a Computed Key?
i.e. "Custom Label" and "SKU" must both contain unique values, and we want "Custom Label" to remain the key. How can we restrict SKU from accepting duplicates?
Thanks!
Example of the current formula inside the Valid_If field for the "SKU" field:
NOT(IN([_THIS], SELECT(Inventory Data[SKU], NOT(IN([SKU], LIST([_THISROW].[Custom Label]))))))
Thanks - works perfectly now!
I am attempting to use this code to accomplish the same thing. Unfortunately, I've found an unwanted side effect: it won't allow you to return to a record and edit it, because it looks back at "itself" and flags it as a duplicate. Is there a solution to that problem? Here is my exact code:
NOT(IN([_THIS], SELECT(Level 3[Employee Number], NOT(IN([Employee Number], LIST([_THISROW].[Audit ID]))))))
Try:
ISBLANK(
FILTER(
"Level 3",
AND(
([_ROWNUMBER] <> [_THISROW].[_ROWNUMBER]),
([Employee Number] <> [_THISROW].[Audit ID]),
([Employee Number] = [_THIS])
)
)
)
Essentially: FILTER() finds all rows in the Level 3 table with a matching Employee Number, excluding the current row itself and excluding rows whose Employee Number matches this row's Audit ID value. If FILTER() returns anything after those exclusions, you've got a duplicate value.
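The exclusion logic can be modeled in Python for anyone who wants to trace it (the rows are hypothetical, and `_ROWNUMBER` stands in for AppSheet's built-in row number):

```python
# Rough Python model of the ISBLANK(FILTER(...)) expression above.
# "Level 3" is modeled as a list of dicts with hypothetical rows.

level3 = [
    {"_ROWNUMBER": 2, "Employee Number": "E-100", "Audit ID": "a1"},
    {"_ROWNUMBER": 3, "Employee Number": "E-200", "Audit ID": "a2"},
]

def employee_number_is_valid(this_value, this_rownumber, this_audit_id):
    """True when FILTER() would return an empty list, i.e. ISBLANK()."""
    matches = [
        row for row in level3
        if row["_ROWNUMBER"] != this_rownumber       # skip the row being edited
        and row["Employee Number"] != this_audit_id  # the extra exclusion kept from the original
        and row["Employee Number"] == this_value     # same value on another row = duplicate
    ]
    return len(matches) == 0

# Re-saving row 2 with its own value no longer trips the check:
print(employee_number_is_valid("E-100", 2, "a1"))  # True
# A new row (row number 4) reusing E-100 is caught as a duplicate:
print(employee_number_is_valid("E-100", 4, "a9"))  # False
```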
Just curious (and may not matter), but if the Audit ID and Employee Number are never equivalent (the Audit ID is a randomly generated alphanumeric string), is the second boolean test: ([Employee Number] <> [_THISROW].[Audit ID]) necessary?
That's your call. You had it in your expression, so I preserved it.
Dear All,
I have a table named "Import Waybill" with the below-mentioned columns:
Way Bill ID = (Unique ID) as the Key
Way Bill Date = Date
Way Bill No = Text
Truck Number Plate = Text
Waybill PK = Text, with the app formula = TEXT([Waybill Date],"DD-MMM-YY") & "-" & [Waybill No] & "-" & [Truck Number Plate]
I wish to prevent entry of duplicate records that have the same date, the same waybill number, and the same truck plate number.
I tried using the below-mentioned formulas in the column Waybill PK:
Valid If = NOT(IN([_ThisRow].[Waybill PK],SELECT(IMPORT WAYBILL[Waybill PK],[Waybill PK] <> [_ThisRow].[Waybill PK])))
and
NOT(IN([_This],SELECT(IMPORT WAYBILL[Waybill PK],[Waybill PK] <> [_ThisRow].[Waybill PK])))
Neither worked. Please suggest if you have any better solution for this.
Thanks all, I have figured out the solution to my query in the previous post:
AND(([Waybill Date]<>[_THISROW].[Waybill Date]),([Waybill No]<>[_THISROW].[Waybill No]),([Truck Number Plate]<>[_THISROW].[Truck Number Plate]))
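For readers landing here later: the line above is only the field-comparison fragment, not a complete Valid_If. One way it could slot into a full expression, following the ISBLANK(FILTER(...)) pattern shown earlier in this thread, is sketched below. This is an assumption about how the poster wired it up, not their confirmed expression; note that the comparisons flip to = when the condition is used to *find* matching rows rather than to exclude them, and the _ROWNUMBER test excludes the row being edited:

```
ISBLANK(
  FILTER(
    "Import Waybill",
    AND(
      ([_ROWNUMBER] <> [_THISROW].[_ROWNUMBER]),
      ([Waybill Date] = [_THISROW].[Waybill Date]),
      ([Waybill No] = [_THISROW].[Waybill No]),
      ([Truck Number Plate] = [_THISROW].[Truck Number Plate])
    )
  )
)
```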