Conversation
I did this because many of the binary decoders were incorrect.

- Remove existing binary decoder tests and replace them with integration tests against Postgres (to verify that they actually decode the real PG binary representation)
- Update existing binary decoders for date/timestamp/time, string, and json
- Add hstore decoder (see the sketch below)
- Add numeric decoder
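For the hstore decoder, the binary representation is an Int32 pair count followed by length-prefixed keys and values, with a length of -1 marking a NULL value. A rough sketch of decoding that format (illustrative names, not roc's actual API) could look like:

```scala
import java.nio.ByteBuffer
import java.nio.charset.StandardCharsets

object HstoreBinaryDecoder {
  // hstore's binary format: an Int32 pair count, then for each pair a
  // length-prefixed key and a length-prefixed value, where length -1 marks NULL.
  def decode(bytes: Array[Byte]): Map[String, Option[String]] = {
    val buf = ByteBuffer.wrap(bytes) // big-endian by default, matching the wire format
    val count = buf.getInt()
    (0 until count).map { _ =>
      val key = readUtf8(buf, buf.getInt())
      val valueLength = buf.getInt()
      val value = if (valueLength == -1) None else Some(readUtf8(buf, valueLength))
      key -> value
    }.toMap
  }

  private def readUtf8(buf: ByteBuffer, length: Int): String = {
    val arr = new Array[Byte](length)
    buf.get(arr)
    new String(arr, StandardCharsets.UTF_8)
  }
}
```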
Current coverage is 83.13%

|          | master | #72 | diff |
|----------|-------:|----:|-----:|
| Files    |     18 |  18 |      |
| Lines    |    760 | 836 |  +76 |
| Methods  |    748 | 819 |  +71 |
| Messages |      0 |   0 |      |
| Branches |     12 |  17 |   +5 |
| + Hits   |    661 | 695 |  +34 |
| - Misses |     99 | 141 |  +42 |
| Partials |      0 |   0 |      |
    implicit val stringElementDecoder: ElementDecoder[String] = new ElementDecoder[String] {
      def textDecoder(text: String): String = text
      def binaryDecoder(bytes: Array[Byte]): String = bytes.map(_.toChar).mkString
Nice catch. My brain wasn't working when I originally wrote that for some reason.
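For reference, a fix in the shape of the quoted decoder would route the bytes through an explicit charset instead of widening each byte to a Char (a sketch, assuming the server sends UTF-8):

```scala
import java.nio.charset.StandardCharsets

implicit val stringElementDecoder: ElementDecoder[String] = new ElementDecoder[String] {
  def textDecoder(text: String): String = text
  // Decoding through a charset keeps multi-byte UTF-8 sequences intact,
  // which byte-by-byte Char widening does not.
  def binaryDecoder(bytes: Array[Byte]): String = new String(bytes, StandardCharsets.UTF_8)
}
```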
The binary decoder tests only tested some arbitrary notion of what they should decode. That's why I removed them in favor of integration tests which test against what Postgres is actually going to send.
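One way to write such a test (not necessarily how this PR does it) is to ask the server for its binary representation directly via the *send functions, which return a bytea, and feed those bytes to the decoder under test. The connection details and `decodeInt4` below are placeholders:

```scala
import java.nio.ByteBuffer
import java.sql.DriverManager

object BinaryFormatSpec {
  // Placeholder decoder under test: int4 on the wire is a big-endian Int32.
  def decodeInt4(bytes: Array[Byte]): Int = ByteBuffer.wrap(bytes).getInt()

  def main(args: Array[String]): Unit = {
    // Requires the Postgres JDBC driver on the classpath; the URL is a placeholder.
    val conn = DriverManager.getConnection("jdbc:postgresql://localhost/test", "user", "pass")
    try {
      // int4send(42) returns exactly the bytes Postgres would send for 42 in
      // binary result format, so the decoder is checked against real server output.
      val rs = conn.createStatement().executeQuery("SELECT int4send(42)")
      rs.next()
      assert(decodeInt4(rs.getBytes(1)) == 42)
    } finally conn.close()
  }
}
```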
Yeah, I missed a decent portion of these. I read the documentation, then went and looked at the "authorized" Postgres JDBC driver, and the only code I found there dealt with the standard Short / Int / Long / Float / Double big-endian format. If I recall correctly, I left some of the other binary decoders out because the documentation said that if no binary version was found, the text version was assumed.
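For those fixed-width types the binary format really is just the big-endian bytes, so a sketch covering them is short (ByteBuffer defaults to big-endian):

```scala
import java.nio.ByteBuffer

// Sketch only: the fixed-width numeric types are sent as their big-endian bytes.
object PrimitiveBinaryDecoders {
  def decodeShort(bytes: Array[Byte]): Short   = ByteBuffer.wrap(bytes).getShort()
  def decodeInt(bytes: Array[Byte]): Int       = ByteBuffer.wrap(bytes).getInt()
  def decodeLong(bytes: Array[Byte]): Long     = ByteBuffer.wrap(bytes).getLong()
  def decodeFloat(bytes: Array[Byte]): Float   = ByteBuffer.wrap(bytes).getFloat()
  def decodeDouble(bytes: Array[Byte]): Double = ByteBuffer.wrap(bytes).getDouble()
}
```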
Yeah, for binary decoders you pretty much have to look at the postgres source itself and port it from C to Scala. These were built by porting the *recv functions from backend/utils/adt in the postgres source tree.
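For example, date_recv and timestamp_recv both count from the Postgres epoch of 2000-01-01, so a port along those lines (a sketch, not necessarily the exact code in this PR, and assuming integer datetimes) looks roughly like:

```scala
import java.nio.ByteBuffer
import java.time.temporal.ChronoUnit
import java.time.{LocalDate, LocalDateTime}

object DateTimeBinaryDecoders {
  // Postgres counts from its own epoch, 2000-01-01, in both formats below.
  private val EpochDate     = LocalDate.of(2000, 1, 1)
  private val EpochDateTime = LocalDateTime.of(2000, 1, 1, 0, 0)

  // date_recv: a big-endian Int32 of days since 2000-01-01.
  def decodeDate(bytes: Array[Byte]): LocalDate =
    EpochDate.plusDays(ByteBuffer.wrap(bytes).getInt().toLong)

  // timestamp_recv (integer datetimes): a big-endian Int64 of microseconds since 2000-01-01 00:00.
  def decodeTimestamp(bytes: Array[Byte]): LocalDateTime =
    EpochDateTime.plus(ByteBuffer.wrap(bytes).getLong(), ChronoUnit.MICROS)
}
```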
    val buf = ByteBuffer.wrap(bytes)
    Xor.catchNonFatal {
      val count = buf.getInt()
      val charset = StandardCharsets.UTF_8 // TODO: are we going to support other charsets?
Yes, we will at some point, but I was putting off updating the current Charset in the Client Dispatcher until I had figured out how I wanted to represent the Connection State Machine.
I just opened #73 for a discussion on it.
finagle-postgres has a pretty battle-tested state machine. I know its other code is somewhat dated/non-idiomatic but the state machine seems pretty robust.
Yeah, most of the current machines were written not so much by copy-pasting from that code base as by looking it over heavily.
I think some benefit could be had from bringing over its state machine - it supports a lot of features currently missing from roc (like the parse-bind-execute query flow)
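For reference, the parse-bind-execute flow is the protocol's extended query mode: the frontend sends Parse, Bind, Describe, Execute, and Sync, and the backend answers with ParseComplete, BindComplete, RowDescription, DataRows, CommandComplete, and ReadyForQuery. A simplified sketch of the frontend messages (illustrative types, omitting format codes and some fields):

```scala
// Simplified model of the extended-query frontend messages; these are not
// roc's or finagle-postgres's actual types.
sealed trait FrontendMessage
final case class Parse(statementName: String, query: String, paramTypeOids: List[Int]) extends FrontendMessage
final case class Bind(portal: String, statementName: String, params: List[Option[Array[Byte]]]) extends FrontendMessage
final case class Describe(portal: String) extends FrontendMessage
final case class Execute(portal: String, maxRows: Int) extends FrontendMessage
case object Sync extends FrontendMessage
```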
All of this looks great! I'm super excited to accept a PR for this :). If you can delete those unit tests that aren't being called, I think this is ready to go.