Again, almost all indexes are based on words and stopwords, so you can't query for symbols.
For a single asterisk, you can use the hash index.
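For example, a minimal sketch (assuming the predicate is named text; a hash index only supports full-value matches with eq, so this finds entries whose entire text is a single asterisk):

    text: string @index(hash) .

    {
      star(func: eq(text, "*")) {
        uid
        text
      }
    }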
Maybe you could request an index that can index symbols too. I think such an index would be bigger and consume more resources to process, so it makes sense for it to be a separate one.
Hmm, this is very confusing, because when I change the text to "text with 1 in middle" I can now find "1" with the same query. And "1" has the same size in UTF-8 as "*". There are no special requirements for indexing, in my opinion.
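For reference, the schema here is presumably just a term index on the predicate, which is what anyofterms requires (my assumption; the thread doesn't show the schema):

    text: string @index(term) .

The query: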
{
  all(func: has(text)) {
    uid
    text
  }
  w(func: anyofterms(text, "1")) {
    uid
    text
  }
}
results:
{
  "data": {
    "all": [
      {
        "uid": "0x2",
        "text": "a"
      },
      {
        "uid": "0x3",
        "text": "*"
      },
      {
        "uid": "0x4",
        "text": "text with 1 in middle"
      }
    ],
    "w": [
      {
        "uid": "0x4",
        "text": "text with 1 in middle"
      }
    ]
  }
}
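For comparison, the same query with the asterisk (which, as reported above, is what fails) presumably returns an empty list, even though node 0x3 holds exactly that value:

    {
      w(func: anyofterms(text, "*")) {
        uid
        text
      }
    }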
So, what should I say to our customers who have a business use case where they want to find * in text?
Should I tell them that SlashQL is not able to find a simple * and that they are not human because they want to find something that is not useful for humans?
SlashQL does not allow installing a custom tokenizer.
And this is a really simple task in MongoDB, Elasticsearch, and also in SQL databases.
We’ll get this looked at and get back to you. v1.0.x is a pretty old cluster, so the code has changed quite a lot since then, but I agree this should work with the term tokenizer.