It feels like your DB got corrupted due to a disk or RAM issue. You should use HA and give more resources to your cluster. The best you can do now is to export your dataset and create a new cluster.
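If you need a starting point for the export, a minimal sketch using the Alpha's /admin GraphQL endpoint looks like this (assuming the Alpha's HTTP port is reachable on localhost:8080; adjust host and port for your setup):

# Trigger an RDF export; the files are written to the Alpha's export directory.
curl -s -H 'Content-Type: application/json' localhost:8080/admin \
  -d '{"query": "mutation { export(input: {format: \"rdf\"}) { response { message code } } }"}'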
But this is not happening for all nodes, only a very few, so I couldn't figure out what went wrong. Also, I'm not able to delete that node by uid; the same error keeps popping up.
Do what I said. It will stop.
Your cluster may be compromised due to some corruption. It may be the timestamps, the data itself, or the data on disk that got corrupted due to a disk failure. If you have HA, this won't happen.
If you give low resources to your cluster, meaning you don't dimension it correctly, it will end up like that. The best way to recover from this is to export the data and rebuild the cluster with a proper setup.
Hi, we upgraded our DB to the latest version, 22.0.2, but when trying to import data from an RDF file we get the error below. How do we move forward?
2023/02/14 10:56:30 RDF doesn't match schema: Input for predicate "Country" of type scalar is uid. Edge: entity:966178 attr:"\000\000\000\000\000\000\000\000Country" value_type:UID value_id:6338232
root@19e11662e25f:/dgraph#
We are not able to query, as the above error is failing the data import. The error comes up while trying to import the RDF and schema files in Docker, with a command on Linux like the one below.
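Roughly like the following, run inside the Dgraph container (the file names and the Zero address here are placeholders for our actual values):

# Bulk load of the RDF and schema files (run while the Alphas are offline).
dgraph bulk -f /dgraph/data.rdf -s /dgraph/data.schema --zero=zero:5080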
Yeah. The problem lies in that message: "RDF doesn't match schema". Have you added a schema before loading that data? If so, your schema and your data disagree about the Country predicate: the schema declares it as a scalar value, while the data provides a uid for it.
Check your schema and take a look at your data. If this is just a test env, delete all data in Dgraph's paths and load that data again. This looks like Dgraph's dataset that you may have found in our repos. If a clean load still gives that error, I'm gonna check whether the dataset is okay or needs to be edited.
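A rough sketch of both steps, assuming the Alpha's HTTP port is on localhost:8080; the drop_all call is an alternative to removing the p/w/zw directories by hand and wipes everything, so only use it on a disposable test env:

# Inspect the schema currently stored in the cluster.
curl -s -H 'Content-Type: application/dql' localhost:8080/query -d 'schema {}'

# Drop all data and schema from the test cluster before reloading.
curl -s -X POST localhost:8080/alter -d '{"drop_all": true}'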
Have you tested what I said? If so, I can update the dataset. But if you wanna do it yourself, you can just unzip it, open it in any IDE, and use regex to find the problematic part, either in the schema or in the dataset.
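For example, something along these lines (the file names are placeholders, assuming a gzipped RDF file):

# List the N-Quads that use the Country predicate, to see whether the object
# is a uid/blank node or a string literal.
zcat data.rdf.gz | grep -n 'Country' | head

# Check how the predicate is declared in the schema file.
grep -n 'Country' data.schema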
Thanks @MichelDiz, we were able to bulk-load the data by editing our RDF file. But after a successful load, we are still not able to see our data in Ratel. We checked in Docker, and the out and p directories are created.
We also tried doing a live load after this, but we are getting "context deadline exceeded". We tried both localhost and the Docker network IP. Also, when we imported the data, we saw that the schema file is empty.