
Pledge amount greater than maxLovelace #1551

Closed
erikd opened this issue Jun 15, 2020 · 16 comments

Comments

@erikd
Contributor

erikd commented Jun 15, 2020

Working on db-sync, I had a failure to insert a PoolParam struct which, after some debugging, turned out to be due to a pledge amount greater than maxLovelace.

Obviously a pledge amount of this size makes no sense, and the transaction including this pledge amount should have been rejected and not included on the chain.

@erikd erikd added the bug Something isn't working label Jun 15, 2020
@nc6
Contributor

nc6 commented Jun 15, 2020

There is a relevant point about bounded values for Coin in general - #1475 seems a good place to cover this. I will bump that one in priority.

@JaredCorduan
Contributor

This is not a bug. Stake pool operators are free to make a pledge as large as they want, even ridiculous ones that they cannot possibly meet. Even just pledging 51% of the total supply is absurd, as you are making a promise to have enough stake to control the entire system. The ledger does not attempt to make a value judgement on what is and is not reasonable, instead it gives out no rewards to a pool that does not meet the pledge in a given epoch.

@JaredCorduan JaredCorduan removed the bug Something isn't working label Jun 15, 2020
@erikd
Contributor Author

erikd commented Jun 17, 2020

This is not a bug.

Ok, not a bug, but definitely a significant WTF moment. In db-sync it means that the lovelace domain type I use for all other Coin values cannot be used for pledge. In db-sync I use domain types and other SQL features to enforce invariants, reducing the chance of invalid data being inserted into the database. The lovelace invariant on pledge may not seem like much, but it is something that gives me just that little bit of safety.

@JaredCorduan
Contributor

In db-sync I use domain types and other SQL features to enforce invariants, reducing the chance of invalid data being inserted into the database. The lovelace invariant on pledge may not seem like much, but it is something that gives me just that little bit of safety.

That's great! As part of #1475 we will introduce bounded values for Coin. We would prefer to use something like maxBound :: Word64 rather than something tied to the monetary policy.
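For illustration, a policy-tied bound could be expressed with a smart constructor along the lines db-sync would have preferred. The names here (`Coin`, `mkCoin`, `maxLovelace`) are a hypothetical sketch, not the actual #1475 API:

```haskell
import Data.Word (Word64)

-- Hypothetical coin type; illustrative only, not the real ledger type.
newtype Coin = Coin Word64
  deriving (Eq, Ord, Show)

-- 45 billion ADA expressed in lovelace (1 ADA = 1,000,000 lovelace).
maxLovelace :: Word64
maxLovelace = 45000000000000000

-- Smart constructor that rejects values above the monetary-policy cap.
mkCoin :: Word64 -> Maybe Coin
mkCoin w
  | w <= maxLovelace = Just (Coin w)
  | otherwise        = Nothing
```

With the ledger's preferred bound of `maxBound :: Word64`, such a constructor would always succeed, since every `Word64` is in range by construction.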

@erikd
Contributor Author

erikd commented Jun 22, 2020

The actual monetary policy max lovelace value is part of the ledger. It would be nice if the value was validated against that.

@erikd
Contributor Author

erikd commented Jul 1, 2020

Allowing arbitrary values for the pledge amount is a SERIOUS pain in the neck for db-sync. The fact that I cannot rely on the pledge amount being strictly in [0 .. maxLovelace] means that I have to accept Word64, which is not handled natively by the Persistent library, resulting in IntersectMBO/cardano-db-sync#163

Not fixing this in the LEDGER kicks the can down the road, making life more difficult for everyone who uses these values.
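The concrete pain point: a PostgreSQL bigint (and the Int64 it corresponds to) tops out at 2^63 - 1, while pledges occupy the full Word64 range. A naive conversion of an out-of-range pledge wraps to a negative number, as this minimal sketch shows (using the pledge value reported later in this thread):

```haskell
import Data.Int  (Int64)
import Data.Word (Word64)

-- The pledge value reported later in this thread.
badPledge :: Word64
badPledge = 9999372036855000000

-- Naive conversion into the Int64 range of a Postgres bigint column:
-- the value exceeds maxBound :: Int64 (9223372036854775807),
-- so it wraps around to a negative number.
asInt64 :: Int64
asInt64 = fromIntegral badPledge
```

This is why a type or column sized for [0 .. maxLovelace] cannot simply be reused for pledge, and why the workaround requires special-casing Word64.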

@erikd erikd reopened this Jul 1, 2020
@dcoutts
Contributor

dcoutts commented Jul 1, 2020

We will have to handle Word64. Word64 occurs in many types, for example in metadata.

Large values will also occur as a result of aggregation queries (e.g. total amounts traded over some period of time).

@rdlrt

rdlrt commented Jul 1, 2020

While supporting Word64 handling sounds valid, I think the core requirement is to validate these values in the formal executable spec, instead of having to implement a solution in derivative (secondary-layer) components. The absence of such validation could in turn lead to the parameters being handled differently by different components, which sounds sub-optimal.

@JaredCorduan
Contributor

I understand that unbounded values are problematic, but I do not see any validation issue with a pledge larger than the max lovelace. The pledge is completely agnostic to the monetary policy. Why is a bound of maxLovelace preferred to maxBound :: Word64 by db-sync?

@papacarp

papacarp commented Jul 2, 2020

Validation of values that come out of cardano-node seems like a straightforward thing to implement. pooltool just got caught by a completely out-of-bounds value as well (a 1.1e19 cost on a pool). I agree with Erik: these kinds of unconstrained numbers (especially when they are straightforward to constrain, at least at the IO level) will only lead to problems down the line.

@dcoutts
Contributor

dcoutts commented Jul 3, 2020

Checking that a pledge is less than 4.5e10 ada is not validation. The only thing that matters is whether the pool can meet its pledge, and there is no fixed level for that. Of course they cannot meet a pledge of 4.5e10 ada, but they also cannot meet pledges much lower than that.

Fundamentally, if pools want to shoot themselves in the foot they have many ways to do that, and we cannot stop them all, especially in grey areas like "is this a 'reasonable' cost or pledge".

instead of having to implement a solution at derivatives (secondary layer)

Like what? There is no need to implement anything. There's no fixed level you can pick in downstream components either. Just allow the full word64 range.

pooltool just got caught by a completely out of bounds value as well (1.1e19 cost on a pool)

What does that mean? It's completely legitimate for a pool to declare such a cost. They will not be competitive, but that's not a problem for the system.

@erikd
Contributor Author

erikd commented Jul 25, 2020

db-sync now handles Word64 values correctly, but I still think that this provides pool operators with a significant WTF moment.

The current situation may correctly be considered "not a bug", but for the user it constitutes unexpected and surprising behavior.

@alexqrid

The database filler stopped with this error. I dropped the lovelace_check constraint of the lovelace domain and the database started to fill again, but I am not sure this is the right way. Attaching the logs here.

db-sync logs
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:4] [2020-10-17 15:29:53.37 UTC] localInitiatorNetworkApplication: connecting to node via "state-node-mainnet/sock"
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node.Handshake:Info:28] [2020-10-17 15:29:53.37 UTC] [String "Send MsgProposeVersions (fromList [(NodeToClientV_1,TInt 764824073),(NodeToClientV_2,TInt 764824073),(NodeToClientV_3,TInt 764824073)])",String "LocalHandshakeTrace",String "ConnectionId {localAddress = LocalAddress {getFilePath = \"\"}, remoteAddress = LocalAddress {getFilePath = \"state-node-mainnet/sock\"}}"]
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node.Handshake:Info:28] [2020-10-17 15:29:53.37 UTC] [String "Recv MsgAcceptVersion NodeToClientV_3 (TInt 764824073)",String "LocalHandshakeTrace",String "ConnectionId {localAddress = LocalAddress {getFilePath = \"\"}, remoteAddress = LocalAddress {getFilePath = \"state-node-mainnet/sock\"}}"]
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:32] [2020-10-17 15:29:53.37 UTC] Starting chainSyncClient
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:32] [2020-10-17 15:29:53.48 UTC] Cardano.Db tip is at slot 11369248, block 4830407
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:38] [2020-10-17 15:29:53.48 UTC] Running DB thread
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:38] [2020-10-17 15:29:53.61 UTC] Rolling back to slot 11369248, hash c847d4ffc835132dcfc07c3ff1e10c8090cdaaf9c05bea5a0b9132470aec64ba
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:38] [2020-10-17 15:29:53.61 UTC] Deleting blocks numbered: []
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:41] [2020-10-17 15:29:53.64 UTC] getHistoryInterpreter: acquired
Oct 17 15:29:53 cardano-fork cardano-db-sync[28903]: [db-sync-node:Error:38] [2020-10-17 15:29:53.67 UTC] Bad pledge amount: 9999372036855000000 > maxLovelace. See https://github.com/input-output-hk/cardano-ledger-specs/issues/1551
Oct 17 15:29:55 cardano-fork cardano-db-sync[28903]: [db-sync-node:Error:38] [2020-10-17 15:29:55.32 UTC] Bad pledge amount: 9999372036855000000 > maxLovelace. See https://github.com/input-output-hk/cardano-ledger-specs/issues/1551
Oct 17 15:29:55 cardano-fork cardano-db-sync[28903]: [db-sync-node:Error:38] [2020-10-17 15:29:55.45 UTC] Bad pledge amount: 9999372036855000000 > maxLovelace. See https://github.com/input-output-hk/cardano-ledger-specs/issues/1551
Oct 17 15:29:59 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:38] [2020-10-17 15:29:59.59 UTC] insertShelleyBlock: epoch 223, slot 11382211, block 4831036, hash 4e27af92d5d366fededd9f8fc6b7855c032a2f9ec1311c5a7cfc975fa5f3c7ec
Oct 17 15:29:59 cardano-fork cardano-db-sync[28903]: [db-sync-node:Info:38] [2020-10-17 15:29:59.59 UTC] insertShelleyBlock: epoch 223, slot 11382213, block 4831037, hash d7d9fff811b0ad502ec562722bb32c4a08bd56c36ecf173470fba51c013ec3b4

If this is the right way to solve the problem, maybe the schema needs to be changed.

@erikd
Contributor Author

erikd commented Oct 19, 2020

@alexqrid This was due to a philosophical impedance mismatch between what ledger-specs enforces and what db-sync (including PostgreSQL) can deal with.

The specific problem was fixed in IntersectMBO/cardano-db-sync#352 and the more general philosophical problem was fixed in IntersectMBO/cardano-db-sync#358.

From now until the next release from the master branch (hopefully early this week), my advice here still stands.

@papacarp

I still don't understand the justification for not screening nonsensical values on pool registration. As a pool explorer, I am now bound to faithfully display a pledge of 72B ADA to reflect the actual value as registered even though it makes no sense.

I agree that there will need to be unbounded integer values all over the infrastructure, but where we can bound them we should.

My 0.02 lovelaces

@dcoutts
Contributor

dcoutts commented Oct 19, 2020

The philosophy of the Shelley ledger rules is to enforce only the necessary constraints. It's the Occam's razor approach: if a constraint is not necessary for the correct functioning of the ledger, it is not included. This gives intellectual clarity: it should be possible to point at any constraint in the rules and say why it is needed. That would not be the case if there were redundant checks.

This philosophy means for example that the ledger rules do not prevent people from declaring pool pledges that they could not possibly satisfy. It is not prevented because it is unnecessary to do so: the pledge will not be satisfied anyway.

Similarly, the ledger rules do not check that tx outputs are less than the total circulating supply. It is unnecessary to do so, since such a check is redundant: it is already implied by the fact that the inputs and outputs must balance.
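The redundancy argument can be made concrete with a toy preservation-of-value check. This is a deliberately simplified sketch (the real Shelley rule also accounts for deposits, withdrawals, and refunds), and the name `balances` is illustrative, not a real ledger function:

```haskell
-- Toy preservation-of-value check: value consumed must equal value
-- produced. Integer is used to keep the sketch free of overflow
-- concerns; this is not the real Shelley UTXO rule.
balances :: [Integer] -> [Integer] -> Integer -> Bool
balances inputs outputs fee = sum inputs == sum outputs + fee
```

Since every input is itself drawn from the circulating supply, no output larger than the supply can ever satisfy this equation, so a separate "output is at most the supply" check would be dead weight.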

As a pool explorer, I am now bound to faithfully display a pledge of 72B ADA to reflect the actual value as registered even though it makes no sense.

That is true, but this is unavoidable. There is no cut-off we could pick that would separate the sense from the nonsense. Is a 200M pledge sense or nonsense? What about 300M? The fact is it is impossible to say in advance what values make any sense, and this will change over time as the amount in circulation grows and the value of K changes. What we can say precisely is when a pledge is being met, either right now or at an epoch boundary.

So what Daedalus does, and what a pool explorer could do, is to show when a pool's pledge is not currently being met. This is much more useful than trying to find some line in the sand for unrealistically large pledges. Then it doesn't matter what stupid value someone has pledged: they're not meeting their pledge. You don't even have to show what the pledge value is in that case.

I agree that there will need to be unbounded integer values all over the infrastructure, but where we can bound them we should.

And note that we do bound them, by bounding types like Word64 or Int64.
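For reference, the two bounds under discussion, side by side. The value of maxLovelace here is the 45 billion ADA monetary-policy cap expressed in lovelace; the comparison is a sketch, not ledger code:

```haskell
import Data.Word (Word64)

-- The bound the types provide: 2^64 - 1, about 1.8e19.
word64Max :: Word64
word64Max = maxBound

-- The monetary-policy bound discussed in this thread:
-- 45 billion ADA in lovelace, about 4.5e16 (roughly 400x smaller).
maxLovelace :: Word64
maxLovelace = 45000000000000000
```

So the type-level bound is real, but it is a few hundred times looser than the monetary-policy bound that db-sync originally tried to enforce.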

7 participants