Bounty Challenge: Prove that Badger loses data

Moved from GitHub badger/601

Posted by danielmai:

(This is a renewal of the challenge initiated in #570)

The Challenge

We are always looking to proactively fix issues in our software. If you can prove that Badger loses data under certain conditions and provide a reproducible set of instructions of the data loss, we’ll pay you $1337 as a cash reward for your finding. You find an issue in Badger and get paid, and we get to improve our software. It’s a win-win.

Conditions

  • Badger must be run with SyncWrites=true (the default). Badger only promises durability of successful writes. (See the setup sketch after this list.)
  • A valid data-loss bug must use Badger's public APIs and use them correctly. Triggering unwanted behavior through internal functions, or by misusing the public APIs, can still help make Badger more robust, but it won't constitute a valid entry for this challenge.
  • The steps must be reproducible by the Dgraph team. If we can’t reproduce them at our end, we won’t consider it a valid entry.
  • The setup must be general enough. If it requires a supercomputer or some crazy RAID array to run, we won’t be able to reproduce the issue.
  • We try to keep things simple and intend to pay the same amount for valid issues. However, if the bug only causes Badger to not return data, and the data is fully recoverable by an easy fix to the code, that's technically not a data loss (see #578). For those bugs, we reserve the right to pay less than the above-mentioned bounty amount. The amount for these bugs will be decided based on an internal team review.
  • Challenge participants must adhere to our Code of Conduct. To summarize it briefly, don’t be destructive or inflammatory.
  • In cases of disagreement, the Dgraph team will make the final decision.
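
For reference, here is a minimal setup sketch in Go of what "run with SyncWrites=true via the public API" looks like. This assumes the v2 module path and the DefaultOptions/WithSyncWrites builder; since SyncWrites is already the default per the first condition, the explicit call only makes the requirement visible.

    package main

    import (
        "log"

        badger "github.com/dgraph-io/badger/v2" // module path is an assumption; match the version you test
    )

    func main() {
        // SyncWrites is the default per the conditions above; it is set
        // explicitly here only to make the durability requirement visible.
        opts := badger.DefaultOptions("/tmp/badger-bounty").WithSyncWrites(true)

        db, err := badger.Open(opts)
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        // A successful write with SyncWrites=true is promised to be durable.
        err = db.Update(func(txn *badger.Txn) error {
            return txn.Set([]byte("key"), []byte("value"))
        })
        if err != nil {
            log.Fatal(err)
        }
    }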

Submit an entry

If you have a reproducible data loss bug, don’t reply here. File a separate GitHub issue with a link to this issue and we’ll evaluate it.

networkimprov commented :

A data-loss bug was already filed when you posted this issue. I’d like to collect my bounty :slight_smile:

#563

stale commented :

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale commented :

This issue was marked as stale and no activity has occurred since then, so it will now be closed. Please reopen it if the issue is still relevant.

simonxw commented :

I think #995 is a data-loss issue.

ForCraft2013 commented :

Maybe this can prove the issue?
https://github.com/dgraph-io/badger/issues/1126

ForCraft2013 commented :

@danielmai how can I contact you? In the already-closed issue Data loss while reading from DB · Issue #1126 · dgraph-io/badger · GitHub, you said to email you, but there has been no response for a long time (10 days). Maybe my emails went to spam… :pensive:

danielmai commented :

I’ve replied to your recent email @ForCraft2013. I never saw the older ones, so thanks for reaching out again.

GameXG commented :

Hi,
Does #1280 meet the conditions?

Issue: Sequence generates duplicate values · Issue #1280 · dgraph-io/badger · GitHub

swdee commented :

Hi,

I believe issue Question: Insert on Duplicate Transaction · Issue #1289 · dgraph-io/badger · GitHub meets your bounty criteria: the bug causes data loss because conflicting transactions overwrite each other. The values from the last transaction are the ones written to the DB, instead of the first transaction winning and the subsequent ones returning ErrConflict.
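
For context, a rough sketch of the read-then-write pattern being described, assuming the v2-style API (the module path and the insertIfAbsent helper are illustrative, not part of Badger). When two such transactions overlap on the same key, the expectation is that the later Commit returns badger.ErrConflict rather than silently overwriting.

    package main

    import (
        "log"

        badger "github.com/dgraph-io/badger/v2" // module path is an assumption
    )

    // insertIfAbsent is a hypothetical helper showing the pattern from #1289:
    // check whether the key exists, then set it inside the same transaction.
    // If two such transactions overlap on the same key, the later Commit is
    // expected to return badger.ErrConflict instead of silently overwriting.
    func insertIfAbsent(db *badger.DB, key, val []byte) error {
        txn := db.NewTransaction(true) // read-write transaction
        defer txn.Discard()

        _, err := txn.Get(key)
        if err == nil {
            return nil // key already present; nothing to insert
        }
        if err != badger.ErrKeyNotFound {
            return err
        }
        if err := txn.Set(key, val); err != nil {
            return err
        }
        // Commit reports badger.ErrConflict if another transaction wrote
        // this key after we read it.
        return txn.Commit()
    }

    func main() {
        db, err := badger.Open(badger.DefaultOptions("/tmp/badger-conflict"))
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        if err := insertIfAbsent(db, []byte("k"), []byte("v")); err != nil {
            log.Println("insert failed:", err)
        }
    }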

jarifibrahim commented :

@swdee Thank you for finding the issue, but it isn't a data loss issue. The data is always written to disk; the problem was a bug in the iteration code that caused the key-existence check to fail. We appreciate your effort in improving Badger. Cheers :stars:

damz commented :

#1422 is a data loss issue (Badger is losing delete markers, which are data). It is reproducible using the public API in the way it is intended to be used, and it cannot be recovered once it has happened.
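
To illustrate what losing a delete marker means in practice, a short sketch (v2-style API assumed): after a committed Delete, reads of that key should keep returning badger.ErrKeyNotFound, even after compaction or a restart; if the marker is dropped, an older value can resurface.

    package main

    import (
        "log"

        badger "github.com/dgraph-io/badger/v2" // module path is an assumption
    )

    func main() {
        db, err := badger.Open(badger.DefaultOptions("/tmp/badger-delete"))
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        // A committed delete must stay deleted: the delete marker is data.
        if err := db.Update(func(txn *badger.Txn) error {
            return txn.Delete([]byte("key"))
        }); err != nil {
            log.Fatal(err)
        }

        // After the delete (including after compaction or reopening the DB),
        // this read should keep returning ErrKeyNotFound. If the marker is
        // lost, an older value can resurface instead.
        err = db.View(func(txn *badger.Txn) error {
            _, err := txn.Get([]byte("key"))
            return err
        })
        if err != badger.ErrKeyNotFound {
            log.Printf("unexpected result: %v (delete marker may have been lost)", err)
        }
    }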
