import sets
import serialization
export serialization
import protobuf_serialization/[internal, types, reader, sizer, writer]
export types, reader, sizer, writer
serializationFormat Protobuf
Protobuf.setReader ProtobufReader
Protobuf.setWriter ProtobufWriter, PreferredOutput = seq[byte]
func supportsInternal[T](ty: typedesc[T], handled: var HashSet[string]) {.compileTime.} =
  ## Verify that `T` is serializable, using `handled` to skip type names
  ## that have already been checked.
  if handled.contains($T):
    return
  handled.incl($T)
  verifySerializable(T)
func supportsCompileTime[T](_: typedesc[T]) =
  ## Run the serializability checks for `T` at compile time.
  when flatType(default(T)) is (object or tuple):
    var handled = initHashSet[string]()
    supportsInternal(flatType(default(T)), handled)
func supports*[T](_: type Protobuf, ty: typedesc[T]): bool =
  ## Check that `Protobuf` can serialize `T` - note that unsupported types
  ## currently fail compilation rather than returning `false`.
|
Cleanup / rewrite (#36)
This is a cleanup / rewrite of the implementation increasing
interoperability with other implementations and fixing several bugs /
security issues
* remove special handling of nim:isms (`ptr`, `ref`, `Option`, `HashSet`
etc) (#4)
* these deviate from "standard" protobuf behavior making the library
less interoperable with other langagues
* supporting "custom" types should be done as part of an extension
framework instead (that can support any type / collection instead of a
few hand-picked special cases)
* don't allow encoding scalars in the core encoder (#31)
* `codec` can be used to encode simple scalars
* switch to `libp2p/minprotobuf`-like encoding base, fixing several
bugs, crashes and inaccuracies (#30, #32, #33)
* move parsing support to separate import
* the parser is a heavy dependency
* allow unknown fields
* unknown fields should be given an extension point allowing the user
to detect / handle them - standard behavior for protobuf is to ignore
them
* work around several faststreams bugs (#22)
* remove machine-word-dependent length prefix options (#35)
* actually, remove varint length prefix too for now (due to
faststreams bugs)
* update version
* verify that strings are valid utf-8 on parse
* fix warnings
* truncate values like C++ version
* allow unpacked fields in proto3
* protobuf2/3 -> proto2/3
* update docs
There's lots left to do here in terms of tests and features:
* Almost all tests are roundtrip tests - meaning they check that writing
and reading have the same bugs (vs outputting conforming protobuf)
* There are very few invalid-input tests
* There's a beginning of an extension mechanism, but it needs more work
* It's generally inefficient to copy data to a protobuf object and then
write it to a stream - the stream writers should probably be made more
general to handle this case better (either via callbacks or some other
"builder-like" mechanism - projects currently using `minprotobuf` will
likely see a performance regression using this library
* `required` semantics are still off - a set/not-set flag is needed for
every field in proto2
* possibly, when annotated with proto2, we should simply rewrite all
members to become `PBOption` (as well as rename the field)
2023-01-10 08:07:24 +00:00
|
|
|
  # TODO return false when not supporting, instead of crashing compiler
  static: supportsCompileTime(T)
  true
func computeSize*[T: object](_: type Protobuf, value: T): int =
  ## Return the encoded size of the given value
  computeObjectSize(value)
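
# A minimal usage sketch of the format entry points defined above - kept as a
# comment to avoid adding a dependency on user types. The `proto3`,
# `fieldNumber` and `pint` pragmas and the `encode`/`decode` procs are assumed
# from this library's documented API; adjust to the actual pragma set in use:
#
#   type Test {.proto3.} = object
#     x {.fieldNumber: 1, pint.}: int32
#
#   doAssert Protobuf.supports(Test)
#   let encoded = Protobuf.encode(Test(x: 42))
#   doAssert Protobuf.decode(encoded, Test).x == 42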