Panic when importing "A bigger dataset" in the Tour

I’m working through the Tour. When I get to “A bigger dataset”, I’m having issues loading that data.

If I run:

docker exec -it dgraph dgraph live -f 1million.rdf.gz --alpha localhost:9080 --zero localhost:5080 -c 1

I get the following output:

Running transaction with dgraph endpoint: localhost:9080
Found 1 data file(s) to process
Processing data file "1million.rdf.gz"

panic: interface conversion: interface {} is *time.Time, not time.Time

goroutine 105 [running]:
github.com/dgraph-io/dgraph/tok.YearTokenizer.Tokens(0x18732a0, 0xc0044fd080, 0x10, 0xc0044e0490, 0x14, 0xc0018a3413, 0xc002e00000)
        /tmp/go/src/github.com/dgraph-io/dgraph/tok/tok.go:214 +0x185
github.com/dgraph-io/dgraph/tok.BuildTokens(0x18732a0, 0xc0044fd080, 0x1a864e0, 0x249ab50, 0x1a864e0, 0x249ab50, 0x0, 0x1, 0xc0044e0480)
        /tmp/go/src/github.com/dgraph-io/dgraph/tok/tok.go:104 +0x63
github.com/dgraph-io/dgraph/dgraph/cmd/live.(*loader).conflictKeysForNQuad(0xc0001a2000, 0xc0030dc120, 0x10, 0x2, 0x2, 0x0, 0x0)
        /tmp/go/src/github.com/dgraph-io/dgraph/dgraph/cmd/live/batch.go:309 +0x49e
github.com/dgraph-io/dgraph/dgraph/cmd/live.(*loader).conflictKeysForReq(0xc0001a2000, 0xc002831f88, 0xc00456e001, 0x6a7, 0x800)
        /tmp/go/src/github.com/dgraph-io/dgraph/dgraph/cmd/live/batch.go:330 +0xe9
github.com/dgraph-io/dgraph/dgraph/cmd/live.(*loader).makeRequests(0xc0001a2000)
        /tmp/go/src/github.com/dgraph-io/dgraph/dgraph/cmd/live/batch.go:392 +0x1db
created by github.com/dgraph-io/dgraph/dgraph/cmd/live.setup
        /tmp/go/src/github.com/dgraph-io/dgraph/dgraph/cmd/live/run.go:359 +0x2f0
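
For anyone curious about the panic message itself: in Go, "interface conversion: interface {} is *time.Time, not time.Time" means a type assertion asked for the value type time.Time while the interface actually held a pointer. A minimal, self-contained sketch of that failure mode follows; it is only an illustration of the pattern, not Dgraph's actual tokenizer code.

package main

import (
	"fmt"
	"time"
)

func main() {
	// Suppose a *time.Time ends up stored in an empty interface,
	// as the panic above suggests happened inside the tokenizer.
	t := time.Now()
	var v interface{} = &t

	// An unchecked assertion to the wrong concrete type panics with
	// the same message seen in the stack trace:
	//   year := v.(time.Time).Year()
	//   // panic: interface conversion: interface {} is *time.Time, not time.Time

	// A type switch (or a checked assertion) handles both forms safely.
	switch x := v.(type) {
	case time.Time:
		fmt.Println("year:", x.Year())
	case *time.Time:
		fmt.Println("year:", x.Year())
	}
}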

I’m running the other docker containers as recommended in the Introduction > Run Dgraph section.

Any idea what the problem is? Am I missing something simple?

Nope, let me investigate this. I think it's related to an old issue, but since the data comes from a well-known dataset, it needs further investigation.

Hey, I found the source of the issue. It is a bug; you can follow it here.

There is also a workaround in the last comment.

Cheers.

Thanks for the investigation and quick response, Michel! I’ve got the dataset loaded now.

There appears to be a schema issue on the next step, but I think I can figure that out or work around it.
