Dgraph September Community Call

Register here

We are super excited to announce our second community call on September 24, 2020 at 9 AM Pacific Time. We are honored to host The New York Times’ Doug Donohoe. He will give a 30-minute talk about how The New York Times reordered 57 million events in-memory using BadgerDB (with code and lessons learned). That’s every version of every New York Times article since 1851, plus related assets (e.g. images and tags).
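For readers who haven’t watched the talk or read the post yet, here is a minimal sketch of the general idea in Go: write each event to Badger under a key that sorts by publication time, then iterate. The keys, values, and database path below are illustrative only, not the actual NYT pipeline schema.

```go
package main

import (
	"fmt"
	"log"

	badger "github.com/dgraph-io/badger/v2"
)

func main() {
	// Open a throwaway Badger database on local disk.
	db, err := badger.Open(badger.DefaultOptions("/tmp/reorder-demo"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Hypothetical events arriving out of order; the key is a sortable
	// timestamp plus an asset ID.
	events := map[string]string{
		"2001-05-17T09:00:00Z/asset-42": "article v3",
		"1851-09-18T00:00:00Z/asset-01": "first issue",
		"1969-07-21T02:56:00Z/asset-07": "moon landing coverage",
	}

	// A WriteBatch amortizes transaction overhead across many writes.
	wb := db.NewWriteBatch()
	defer wb.Cancel()
	for k, v := range events {
		if err := wb.Set([]byte(k), []byte(v)); err != nil {
			log.Fatal(err)
		}
	}
	if err := wb.Flush(); err != nil {
		log.Fatal(err)
	}

	// Badger iterates keys in byte-sorted order, so the read pass
	// yields the events chronologically.
	err = db.View(func(txn *badger.Txn) error {
		it := txn.NewIterator(badger.DefaultIteratorOptions)
		defer it.Close()
		for it.Rewind(); it.Valid(); it.Next() {
			item := it.Item()
			if err := item.Value(func(v []byte) error {
				fmt.Printf("%s => %s\n", item.Key(), v)
				return nil
			}); err != nil {
				return err
			}
		}
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
}
```

Because Badger iterates keys in byte-sorted order, writing events under time-prefixed keys and reading them back produces a chronologically ordered stream.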

This call will also feature Dgraph’s CEO @mrjn and the Dgraph team, who will be there to answer your questions. Please submit your questions in this thread.

Agenda:

9 AM - 9:30 AM: Doug Donohoe from The New York Times
9:30 AM - 9:40 AM: Q&A with Doug Donohoe
9:40 AM - 10:30 AM: Q&A with the Dgraph Team

About Doug Donohoe:

Twitter: https://twitter.com/DougDonohoe
LinkedIn: Doug Donohoe (Pittsburgh, Pennsylvania, United States)

Senior Software Engineer at The New York Times

Doug Donohoe (he/him/his) lives in Pittsburgh, Pennsylvania, and works remotely for The New York Times as a Senior Software Engineer on the Publishing Pipeline team. Doug has created e-commerce, poker, high-frequency trading, ad-tech, healthcare, fintech, and publishing software using Scala, Java, Python, Groovy, C, C++, and most recently Go. He enjoys traveling and hiking with his wife Cindy, has been to all seven continents, and likes turtles.

Recommended reading: Reordering 57 Million New York Times Publishing Assets Using Go and BadgerDB

Recording

Hi Doug, thanks for the presentation. When NYT publicized the Monolog design a few years back, it drew a lot of discussion. Looking back, do you think it’s still the best way to represent the chronology of NYT’s publications?

3 Likes

Hi Doug, that’s a great presentation. I really loved the wrapper for looking into WriteBatch(). I wanted to ask: was there any part of the project where you struggled with Badger and wished certain things were handled by Badger itself, so that the rest of the work would have gone more smoothly?
Thanks.

2 Likes
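For readers who missed the talk, the “wrapper” mentioned above is a layer that gives visibility into Badger’s WriteBatch while a large load runs. The sketch below is purely illustrative (the type name, counters, and logging interval are invented), not the actual code from the presentation.

```go
package batch

import (
	"log"
	"sync/atomic"

	badger "github.com/dgraph-io/badger/v2"
)

// CountingBatch is a hypothetical wrapper around badger.WriteBatch that
// reports progress while a large load is running.
type CountingBatch struct {
	entries int64
	bytes   int64
	wb      *badger.WriteBatch
}

func NewCountingBatch(db *badger.DB) *CountingBatch {
	return &CountingBatch{wb: db.NewWriteBatch()}
}

// Set forwards to the underlying batch and records the write, logging a
// progress line every million queued entries.
func (c *CountingBatch) Set(key, val []byte) error {
	if err := c.wb.Set(key, val); err != nil {
		return err
	}
	n := atomic.AddInt64(&c.entries, 1)
	atomic.AddInt64(&c.bytes, int64(len(key)+len(val)))
	if n%1_000_000 == 0 {
		log.Printf("queued %d entries (%d bytes)", n, atomic.LoadInt64(&c.bytes))
	}
	return nil
}

// Flush commits everything queued so far.
func (c *CountingBatch) Flush() error { return c.wb.Flush() }

// Cancel discards the batch.
func (c *CountingBatch) Cancel() { c.wb.Cancel() }
```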

Interesting presentation. Have you looked at any uses of Dgraph at NYT? It seems like the digital assets could be well represented in a graph database.

1 Like

Have you considered using Dgraph to manage your ontology and queries? What are your thoughts on storing data in a common schema like Schema.org?

1 Like

So, did you finally manage to run the entire workload of 59 million records on your personal laptop? How long did the entire process take in the end?

2 Likes

I’d like to hear from the team about the high-cardinality question that Doug posed…

1 Like

Hi Dgraph team. Earlier this year there was work on multi-tenancy, and in the summer there was a public RFC, but it seems to have been sidelined. Does the plan still match the RFC, and is there a timeline? Thanks!

1 Like

Just to confirm, as we build our apps using Dgraph: is DQL (GraphQL+-) going to be supported indefinitely alongside the GraphQL API?

1 Like

Change data capture was mentioned; this would be a great feature for us. I think it was on the roadmap at one point. Is it still planned?

1 Like

Is there a plan to create a client library that runs in the browser for vanilla GraphQL and GraphQL+- that uses gRPC instead of HTTP?

I would like to build in React Native with a library that will give me the best performance with Dgraph.

2 Likes