feat: tensor type for protobuf deserialization #1645
Conversation
samsja left a comment
So what happens if I do `to_protobuf` with a `TorchTensor` in my doc_type? It will be loaded as a numpy tensor, right?
Yes, if not specified otherwise (i.e. if no `tensor_type` is passed), it comes back as a numpy tensor.
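A short sketch of that default behaviour, assuming the semantics discussed above; the `MyDoc` class and `tensor` field are illustrative, not part of this PR:

```python
import torch
from docarray import BaseDoc, DocVec
from docarray.typing import AnyTensor, NdArray, TorchTensor


class MyDoc(BaseDoc):
    tensor: AnyTensor


# a DocVec backed by torch tensors
docs = DocVec[MyDoc](
    [MyDoc(tensor=torch.zeros(3)) for _ in range(2)], tensor_type=TorchTensor
)
proto = docs.to_protobuf()

# no tensor_type passed on deserialization, so the tensor column
# comes back numpy-backed
restored = DocVec[MyDoc].from_protobuf(proto)
assert isinstance(restored.tensor, NdArray)
```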
This allows `DocVec` to be deserialized to a specific `tensor_type`, i.e. torch, tf, or numpy.

Note that the `tensor_type` passed to `from_protobuf()` does not need to match the tensor type used before serialization. Since all tensors are represented the same way in the proto, any proto can be deserialized to any tensor type.
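A minimal usage sketch of the feature described above (the `MyDoc` class, field name, and shapes are illustrative, not part of the PR):

```python
import numpy as np
from docarray import BaseDoc, DocVec
from docarray.typing import AnyTensor, TorchTensor


class MyDoc(BaseDoc):
    tensor: AnyTensor


# serialize a DocVec that holds numpy tensors
docs = DocVec[MyDoc]([MyDoc(tensor=np.zeros(128)) for _ in range(4)])
proto = docs.to_protobuf()

# deserialize to a different tensor_type: the proto representation is
# backend-agnostic, so it does not have to match the original type
docs_torch = DocVec[MyDoc].from_protobuf(proto, tensor_type=TorchTensor)
assert isinstance(docs_torch.tensor, TorchTensor)
```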
TODO: