Error - "ErrorInvalidRequest"

Getting this error for the query below:

{
  GetDetails(func: eq(UserID, "userid")) {
    expand(_all_) {
      expand(_all_)
    }
  }
}
Response:

{
  "errors": [
    {
      "code": "ErrorInvalidRequest",
      "message": ": Unable to read from value log: {Fid:234 Len:92 Offset:24075916}: Unable to find log file. Please retry"
    }
  ],
  "data": null
}

The same thing was asked here; not sure why this error is coming up.

Feels like your DB got corrupted due to a disk or RAM issue. You should use HA and give more resources to your cluster. The best you can do now is try to export your dataset and create a new cluster.

But this is not happening for all nodes, only a very few, so I couldn't figure out what went wrong. Also, I'm not able to delete that node by uid; the same error pops up.

Do what I said. It will stop.
Your cluster may be compromised by some corruption: it could be the timestamps, the data itself, or the data on disk corrupted by a disk failure. If you have HA, this won't happen.

If you give your cluster too few resources, that is, you don't dimension it correctly, it will end up like that. The best way to recover from that is to export and rebuild it properly.
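
For the export itself, a minimal sketch, assuming a default Alpha reachable on localhost:8080 (the exported files land in the Alpha's export directory):

# On recent versions (v20.04+), export is a GraphQL mutation on the /admin endpoint:
curl -s -X POST localhost:8080/admin \
  -H 'Content-Type: application/json' \
  -d '{"query":"mutation { export(input: {format: \"rdf\"}) { response { message code } } }"}'

# Older v1.x Alphas expose a plain HTTP endpoint instead:
curl localhost:8080/admin/export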

Hi, we upgraded our DB to the latest version, 22.0.2, but after trying to import data from an RDF file we are getting the error below. How do we move forward?

2023/02/14 10:56:30 RDF doesn’t match schema: Input for predicate “Country” of type scalar is uid. Edge: entity:966178 attr:"\000\000\000\000\000\000\000\000Country" value_type:UID value_id:6338232
root@19e11662e25f:/dgraph#

Check your mutation. This error says that the value being written for the predicate "Country" does not match the type registered in the schema: the input edge is a uid, but the schema declares Country as a scalar.
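
As a hypothetical illustration (the predicate name comes from the error; the uids and the literal are invented):

# The schema declares Country as a scalar:
Country: string .

# ...but the RDF supplies a node reference (uid), which the loader rejects:
<0xf3> <Country> <0x60b6b8> .

# Either make the value a literal to match the schema:
<0xf3> <Country> "Brazil" .

# ...or change the schema so Country is a uid edge:
Country: uid .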

I'm not able to run a query, as the error above makes the data import fail. The error comes up while importing the RDF and schema files in Docker on Linux with a command like the one below.

$ dgraph bulk -f goldendata.rdf.gz -s goldendata.schema --http localhost:8000 --zero=localhost:5080

Yeah. The problem lies in that data: "RDF doesn't match schema". Did you add a schema before loading that data? If so, your previous schema declares Country as a scalar, while the data supplies a uid.

Check your schema and take a look at your data. If this is just a test environment, delete all data in Dgraph's paths and load that data again. This looks like a Dgraph dataset you may have found in our repos; if a clean load still gives that error, I'll check whether the dataset is OK or needs to be edited.
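
A minimal sketch of that reset, assuming the stock directory layout (p and w for the Alpha, zw for the Zero, out for bulk loader output) and that the processes/containers are stopped first; adjust if you changed --postings, --wal or --out:

# DANGER: wipes all local Dgraph data; only for a throwaway test environment.
rm -rf p w zw out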

Is there any way we can edit/correct the dataset inside the rdf.gz by identifying the corrupt data?

Have you tested what I said? If so, I can update the dataset. But if you want to do it yourself, you can just unzip it, open it in any IDE, and use a regex to find the problematic part, whether it is in the schema or in the dataset.
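
For example, something like this can narrow it down (the <Country> pattern is just a guess based on the error message):

# List Country edges whose object is a uid (<0x...>) instead of a literal value.
zgrep -nE '<Country>[[:space:]]+<0x' goldendata.rdf.gz | head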

Thanks, @MichelDiz, we were able to bulk-load the data by editing our RDF file. But after a successful load, we are still not able to see our data in Ratel. We checked our Docker setup and the out and p directories were created.

We also tried a live load after this, but we are getting "context deadline exceeded". We tried both localhost and the Docker network IP. Also, when we imported the data, we saw that the schema file was empty.

PS: we are upgrading from v1.0.5 to v22.0.2
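
(For reference: the live loader talks to the Alpha's gRPC port, 9080 by default, and to Zero, not to the HTTP port; a typical invocation, with placeholder file and host names, looks roughly like this.)

dgraph live -f export.rdf.gz -s export.schema --alpha localhost:9080 --zero localhost:5080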

Wow, that's super old. You have a long way to go; a lot of things have changed.

Read this https://dgraph.io/docs/deploy/admin/
Also read this https://dgraph.io/docs/dql/dql-schema/

You may have to add types in the schema and also add dgraph.type: "User" to your nodes, for example.
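
A rough sketch of what that schema change can look like (the type and predicate names, User, user.name, email, are just examples matching the upserts further down):

type User {
  user.name
  email
}
user.name: string .
# regexp() filters, like the one in the upsert below, need a trigram index:
email: string @index(trigram) .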

Thanks for the links @MichelDiz

Any idea on how to fix the empty schema after a successful export and import? We are still not able to see any data in the upgraded Ratel dashboard.

This may happen because all of your nodes need to have a type set, e.g. dgraph.type: "User".
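
You can check what the new cluster actually has with a schema query (in Ratel's console or via DQL); if it comes back empty, the predicates and types never made it across:

schema {}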

You could use bulk upsert. But be careful.

upsert {
  query {
    v as var(func: regexp(email, /.*@company1.io$/))
  }

  mutation {
    set {
      uid(v) <dgraph.type> "Company" .
    }
  }
}

upsert {
  query {
    v as var(func: has(user.name))
  }

  mutation {
    set {
      uid(v) <dgraph.type> "User" .
    }
  }
}
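
After running these, a quick spot check that the types were applied (type names taken from the upserts above):

{
  companies(func: type(Company)) {
    count(uid)
  }
  users(func: type(User)) {
    count(uid)
  }
}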