From facd89ea68bb55d95395b9f6c289f7fc32b8d105 Mon Sep 17 00:00:00 2001 From: Greg Zoller Date: Thu, 25 Jan 2024 23:37:55 -0600 Subject: [PATCH] First tentative steps into reading --- benchmark/README.md | 68 +-- benchmark/build.sbt | 10 +- .../src/main/scala/co.blocke/Benchmark.scala | 12 +- benchmark/src/main/scala/co.blocke/Run.scala | 24 +- .../json/{ => exp}/ByteArrayAccess.java | 2 +- .../scala/co.blocke.scalajack/ScalaJack.scala | 23 +- .../co.blocke.scalajack/json/JsonCodec.scala | 9 +- .../co.blocke.scalajack/json/JsonError.scala | 18 +- .../json/exp/JsonReader.scala | 440 ++++++++++++++++++ .../json/exp/JsonReaderException.scala | 6 + .../json/exp/package.scala | 15 + .../json/reading/ClassDecoder.scala | 15 + .../{JsonReader.scala => JsonReader.scalax} | 9 +- .../json/reading/JsonReader.scalax2 | 114 +++++ .../json/reading/JsonSource.scala | 128 ++++- .../json/writing/JsonCodecMaker.scala | 166 +++++-- .../scala/co.blocke.scalajack/run/Play.scala | 97 ++-- 17 files changed, 1006 insertions(+), 150 deletions(-) rename src/main/java/co/blocke/scalajack/json/{ => exp}/ByteArrayAccess.java (97%) create mode 100644 src/main/scala/co.blocke.scalajack/json/exp/JsonReader.scala create mode 100644 src/main/scala/co.blocke.scalajack/json/exp/JsonReaderException.scala create mode 100644 src/main/scala/co.blocke.scalajack/json/exp/package.scala rename src/main/scala/co.blocke.scalajack/json/reading/{JsonReader.scala => JsonReader.scalax} (90%) create mode 100644 src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scalax2 diff --git a/benchmark/README.md b/benchmark/README.md index 5405fb5f..a7420b02 100644 --- a/benchmark/README.md +++ b/benchmark/README.md @@ -1,12 +1,12 @@ # Performance -JSON serialization benchmarks I found in various repos often measured (IMO) silly things like how fast -a parser could handle a small list of Int. For this benchmark I used a more substantial model + JSON. +JSON serialization benchmarks I found in various project repos often measured (IMO) silly things like how fast +a parser could handle a small list of Int. For this benchmark I used a slightly more substantial model. It's still a small model, but it does have some nested objects and collections that make it a more -realistic test. +interesting test. -The test is run via jmh, a common and accepted benchmarking tool. The JVM is **stock**--not tuned to -within an inch of its life, again to be a more realistic use case. +The test is run via jmh. The JVM is **stock**--not tuned to within an inch of its life, to be a more realistic +use case. Run benchmark from the ScalaJack/benchmark directory (not the main ScalaJack project directory): ``` @@ -40,60 +40,72 @@ sbt "jmh:run -i 10 -wi 10 -f 2 -t 1 co.blocke.*" | Argonaut | thrpt | 20 | 690269.697 | ± 6348.882 | ops/s | | Play JSON | thrpt | 20 | 438650.022 | ± 23800.221 | ops/s | -**Note:** Exact numbers aren't terribly important--they will vary depending on the platform +**Note:** Exact numbers aren't terribly important--they may vary widely depending on the platform used. The important thing is the relative relationship between libraries given all tests were performed on the same platform. ### Interpretation -Performance for ScalaJack has been a journey. ScalaJack is a mature product, and while it -was once (a long time ago) quite fast vs its competition, its performance has lagged -considerably. ScalaJack 8 changes that! +Performance for ScalaJack has been a journey. ScalaJack is a mature product--over 10 yrs old, +can you believe it?. 
Long ago it was quite fast vs its competition. Over the years though, its
+performance has lagged considerably, to the point that it was one of the slower serialization
+libraries. ScalaJack 8 changes that!

 I was sampling and testing against a collection of popular serializers for Scala
 until something quite unexpected happened. When I tested Jsoniter, its performance was through
-the roof! Even faster than hand-tooled code. This was a shock. I had to learn how this
-worked.
+the roof! It far outpaced all competitors for raw speed. This was a shock. I had to
+learn how this worked.

 So full credit where credit is due: ScalaJack 8's reading/writing codec architecture
-is heavily derived from Jsoniter.
+is heavily informed by Jsoniter, so I'll post their license here:

 [Jsoniter's License](https://github.com/plokhotnyuk/jsoniter-scala/blob/af23cf65a70d48834b8fecb792cc333b23409c6f/LICENSE)

 There are a number of optimizations and design choices I elected not to bring over from
-Jsoniter, and of course ScalaJack utilizes our own scala-reflection library to great effect.
+Jsoniter, in many cases because ScalaJack doesn't need them for its intended feature set.
+Of course ScalaJack utilizes our own macro-driven scala-reflection library to great effect,
+which Jsoniter does not use.

-Jsoniter, it turns out, achieves its neck-breaking speed by going deep--very deep. They
-use a lot of low level byte arrays and bitwise operators, much as you'd expect to see in
-a C program, to improve on the standard library functions everyone else uses. It works.
+Jsoniter achieves its breakneck speed by going deep--very deep into macro code
+generation. They also use a lot of low level byte arrays and bitwise operators, much as you'd
+expect to see in a C program, to improve on the standard library functions everyone else uses.
+It works.

 ScalaJack's focus is first and foremost to be frictionless--no drama to the user. ScalaJack requires
 zero boilerplate--you can throw any Scala object (or even a Java object) at it with no pre-preparation
-and it will serialize it. For its intended use-cases, ScalaJack offers excellent performance, equal
-to or exceeding a number of widely-used alternative choices.
+and it will serialize it. For its intended use-cases, out-of-the-box ScalaJack 8 offers excellent
+performance, equal to or exceeding a number of widely-used alternative choices.
+
+If you're willing to suffer just one line of boilerplate, ScalaJack 8 will reward you with
+speed that's in the top one or two of its class ("fast mode" in the results).

 ### Technical Notes

-Achieving extreme speed for ScalaJack was weeks of learning, trial, error,
-and re-writes. I studied Jsoniter, Circe, and ZIO Json, and others to learn optimizations.
+Achieving extreme speed for ScalaJack 8 was several weeks of learning, trial, error,
+and re-writes. I studied Jsoniter, Circe, ZIO Json, and others to learn optimizations.

 The tough news for anyone wanting to duplicate this kind of performance in your own
 code is that there isn't one magic trick to achieve maximum performance. It's a basket
-of techniques, each achieving marginal gains that add up, and you must decide when enough
-is enough. Here's a partial list of learnings incorporated into ScalaJack:
+of techniques, each achieving small marginal gains that add up, and you must decide when
+enough is enough for you. Here's a partial list of learnings incorporated into ScalaJack 8:

 * Being careful when using .asInstanceOf[]... in fact try to avoid it wherever possible
-  as it messes up CPU cache harming performance. This means a lot of very careful type
+  as it messes up CPU cache, harming performance. This means a lot of very careful type
   management, and it's why you see the RTypeRefs from scala-reflection are now all typed
   in the latest version
-* Lots of specific typing. Don't make the computer think--provide detailed types wherever
+* Lots of specific typing. Don't make the compiler think--provide detailed types wherever
   you can
 * For macro-based software like this--find every opportunity to do hard work at compile-time
 * Be mindful of what code your macros generate! You can paint by the numbers with quotes and
-  splices, like the documentaion and blogs suggest, and you'll get something working. When you
-  examine the code this produces, you may be disappointed. If it looks kludgy it will be slow--rework
-  your macros until the code is smooth. For fastest performance you'll actually have to generate
-  custom functions as shown in ScalaJack's code (look at JsonCodecMaker.scala)
+  splices, like the documentation and blogs suggest, and you will get something working.
+  When you examine the code a "stock" macro produces, you may be disappointed
+  if ultimate runtime speed is your goal. The generated code might look a little kludgy, and
+  it will not necessarily be speed optimized. Rework your macros carefully until the generated code
+  is as smooth as you might write by hand. Remember: your macro code doesn't have to win awards for
+  style or beauty--your generated code does! For the fastest performance you'll actually have
+  to generate custom functions as shown in ScalaJack's code (look at JsonCodecMaker.scala). This
+  isn't for the faint of heart. If it all looks like Greek, step back and layer yourself into
+  macros slowly, a piece at a time (see the sketch below).
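
To make that last bullet concrete, here is a rough, hand-written sketch of the *shape* of code the macro aims to emit for a class: one flat writer function per type, field names baked in at compile time, no runtime reflection and no intermediate collections. The `Friend` and `FriendWriter` names are hypothetical illustrations only--this is not ScalaJack's actual generated output (and string escaping is omitted for brevity).

```scala
// Hypothetical illustration--NOT ScalaJack's actual generated code.
final case class Friend(name: String, age: Int, email: String)

object FriendWriter:
  // One flat function: appends fields directly, no reflection, no escaping shown.
  def write(in: Friend, out: java.lang.StringBuilder): Unit =
    out.append("{\"name\":\"").append(in.name).append('"')
    out.append(",\"age\":").append(in.age)
    out.append(",\"email\":\"").append(in.email).append('"')
    out.append('}')

@main def friendWriterDemo(): Unit =
  val sb = new java.lang.StringBuilder
  FriendWriter.write(Friend("Mike", 27, "mike@example.com"), sb)
  println(sb.toString) // {"name":"Mike","age":27,"email":"mike@example.com"}
```

If your macro's generated code reads roughly like this, you're on the right track; if it is a tangle of boxed values, tuples, and generic dispatch, keep reworking the quotes and splices.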
diff --git a/benchmark/build.sbt b/benchmark/build.sbt index c5af1411..45d16189 100644 --- a/benchmark/build.sbt +++ b/benchmark/build.sbt @@ -36,15 +36,17 @@ lazy val benchmark = project libraryDependencies ++= Seq( "org.playframework" %% "play-json" % "3.0.1", "io.argonaut" %% "argonaut" % "6.3.9", - "co.blocke" %% "scalajack" % "fc0b25_unknown", - "co.blocke" %% "scala-reflection" % "sj_fixes_edbef8", + "co.blocke" %% "scalajack" % "3a3001_unknown", + "co.blocke" %% "scala-reflection" % "sj_fixes_f43af7", "dev.zio" %% "zio-json" % "0.6.1", "org.typelevel" %% "fabric-core" % "1.12.6", "org.typelevel" %% "fabric-io" % "1.12.6", "org.typelevel" %% "jawn-parser" % "1.3.2", "org.typelevel" %% "jawn-ast" % "1.3.2", - "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-core" % "2.24.4", - "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-macros" % "2.24.4" % "compile-internal", + "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-core" % "2.24.5-SNAPSHOT", + "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-macros" % "2.24.5-SNAPSHOT" % "compile-internal", + // "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-core" % "2.24.4", + // "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-macros" % "2.24.4" % "compile-internal", // "io.circe" %% "circe-derivation" % "0.15.0-M1", // "io.circe" %% "circe-jackson29" % "0.14.0", // "org.json4s" %% "json4s-jackson" % "4.0.4", diff --git a/benchmark/src/main/scala/co.blocke/Benchmark.scala b/benchmark/src/main/scala/co.blocke/Benchmark.scala index 045faa89..5beb8197 100644 --- a/benchmark/src/main/scala/co.blocke/Benchmark.scala +++ b/benchmark/src/main/scala/co.blocke/Benchmark.scala @@ -43,12 +43,12 @@ trait HandTooledWritingBenchmark { @BenchmarkMode(Array(Mode.Throughput)) @OutputTimeUnit(TimeUnit.SECONDS) class ReadingBenchmark - // extends CirceZ.CirceReadingBenchmark - extends ScalaJackZ.ScalaJackReadingBenchmark - // with JsoniterZ.JsoniterReadingBenchmark - // with ZIOZ.ZIOJsonReadingBenchmark - // with PlayZ.PlayReadingBenchmark - // with ArgonautZ.ArgonautReadingBenchmark + extends CirceZ.CirceReadingBenchmark + with ScalaJackZ.ScalaJackReadingBenchmark + with JsoniterZ.JsoniterReadingBenchmark + with ZIOZ.ZIOJsonReadingBenchmark + with PlayZ.PlayReadingBenchmark + with ArgonautZ.ArgonautReadingBenchmark @State(Scope.Thread) @BenchmarkMode(Array(Mode.Throughput)) diff --git a/benchmark/src/main/scala/co.blocke/Run.scala b/benchmark/src/main/scala/co.blocke/Run.scala index 33451f47..f359887a 100644 --- a/benchmark/src/main/scala/co.blocke/Run.scala +++ b/benchmark/src/main/scala/co.blocke/Run.scala @@ -15,12 +15,30 @@ object RunMe extends App: // deriveEncoder[Record] // } - co.blocke.scalajack.internal.CodePrinter.code { - given codec: JsonValueCodec[Record] = JsonCodecMaker.make - } + // import co.blocke.scalajack.* + // import ScalaJack.* + // implicit val blah: ScalaJack[Record] = sj[Record] + // println(ScalaJack[Record].fromJson(jsData)) + + + // co.blocke.scalajack.internal.CodePrinter.code { + // given codec: JsonValueCodec[Record] = JsonCodecMaker.make + // } // given codec: JsonValueCodec[Record] = JsonCodecMaker.make // println(readFromString[Record](jsData)) // println(writeToString(record)) + import com.github.plokhotnyuk.jsoniter_scala.core._ + import com.github.plokhotnyuk.jsoniter_scala.macros._ + + given codec: JsonValueCodec[Record] = JsonCodecMaker.make + println(readFromString[Record](jsData)) + // } + + // trait JsoniterWritingBenchmark{ + // @Benchmark + // def 
writeRecordJsoniter = writeToString(record) + // } + println("\nDone") \ No newline at end of file diff --git a/src/main/java/co/blocke/scalajack/json/ByteArrayAccess.java b/src/main/java/co/blocke/scalajack/json/exp/ByteArrayAccess.java similarity index 97% rename from src/main/java/co/blocke/scalajack/json/ByteArrayAccess.java rename to src/main/java/co/blocke/scalajack/json/exp/ByteArrayAccess.java index 99771032..d85cc1ad 100644 --- a/src/main/java/co/blocke/scalajack/json/ByteArrayAccess.java +++ b/src/main/java/co/blocke/scalajack/json/exp/ByteArrayAccess.java @@ -1,4 +1,4 @@ -package co.blocke.scalajack.json; +package co.blocke.scalajack.json.exp; import java.lang.invoke.MethodHandles; import java.lang.invoke.VarHandle; diff --git a/src/main/scala/co.blocke.scalajack/ScalaJack.scala b/src/main/scala/co.blocke.scalajack/ScalaJack.scala index cebe8c02..e8258682 100644 --- a/src/main/scala/co.blocke.scalajack/ScalaJack.scala +++ b/src/main/scala/co.blocke.scalajack/ScalaJack.scala @@ -7,6 +7,15 @@ import scala.quoted.* import quoted.Quotes import json.* +case class ScalaJack[T](jsonCodec: JsonCodec[T]): // extends JsonCodec[T] //with YamlCodec with MsgPackCodec + def fromJson(js: String): T = // Either[JsonParseError, T] = + jsonCodec.decodeValue(reading.JsonSource(js)) + + val out = writing.JsonOutput() // let's clear & re-use JsonOutput--avoid re-allocating all the internal buffer space + def toJson(a: T): String = + jsonCodec.encodeValue(a, out.clear()) + out.result +/* case class ScalaJack[T](jsonDecoder: reading.JsonDecoder[T], jsonEncoder: JsonCodec[T]): // extends JsonCodec[T] //with YamlCodec with MsgPackCodec def fromJson(js: String): Either[JsonParseError, T] = jsonDecoder.decodeJson(js) @@ -15,6 +24,7 @@ case class ScalaJack[T](jsonDecoder: reading.JsonDecoder[T], jsonEncoder: JsonCo def toJson(a: T): String = jsonEncoder.encodeValue(a, out.clear()) out.result + */ // --------------------------------------- @@ -27,10 +37,11 @@ object ScalaJack: def sjImpl[T: Type](using Quotes): Expr[ScalaJack[T]] = import quotes.reflect.* val classRef = ReflectOnType[T](quotes)(TypeRepr.of[T], true)(using scala.collection.mutable.Map.empty[TypedName, Boolean]) - val jsonDecoder = reading.JsonReader.refRead(classRef) - val jsonEncoder = writing.JsonCodecMaker.generateCodecFor(classRef, JsonConfig) + // val jsonDecoder = reading.JsonReader.refRead2(classRef) + // println(s"Decoder: ${jsonDecoder.show}") + val jsonCodec = writing.JsonCodecMaker.generateCodecFor(classRef, JsonConfig) - '{ ScalaJack($jsonDecoder, $jsonEncoder) } + '{ ScalaJack($jsonCodec) } // ----- Use given JsonConfig inline def sj[T](inline cfg: JsonConfig): ScalaJack[T] = ${ sjImplWithConfig[T]('cfg) } @@ -38,9 +49,9 @@ object ScalaJack: import quotes.reflect.* val cfg = summon[FromExpr[JsonConfig]].unapply(cfgE) val classRef = ReflectOnType[T](quotes)(TypeRepr.of[T], true)(using scala.collection.mutable.Map.empty[TypedName, Boolean]) - val jsonDecoder = reading.JsonReader.refRead(classRef) - val jsonEncoder = writing.JsonCodecMaker.generateCodecFor(classRef, cfg.getOrElse(JsonConfig)) - '{ ScalaJack($jsonDecoder, $jsonEncoder) } + // val jsonDecoder = reading.JsonReader.refRead2(classRef) + val jsonCodec = writing.JsonCodecMaker.generateCodecFor(classRef, cfg.getOrElse(JsonConfig)) + '{ ScalaJack($jsonCodec) } // refRead[T](classRef) diff --git a/src/main/scala/co.blocke.scalajack/json/JsonCodec.scala b/src/main/scala/co.blocke.scalajack/json/JsonCodec.scala index a8d07bce..200ccc45 100644 --- 
a/src/main/scala/co.blocke.scalajack/json/JsonCodec.scala +++ b/src/main/scala/co.blocke.scalajack/json/JsonCodec.scala @@ -2,14 +2,13 @@ package co.blocke.scalajack package json import writing.* +import reading.* trait JsonCodec[A] { - // TBD... when we're ready to tackle reading! - // def decodeValue(in: JsonReader, default: A): A = ${ - // if (cfg.encodingOnly) '{ ??? } - // else genReadVal(rootTpe :: Nil, 'default, cfg.isStringified, false, 'in) - // } + // def decodeValue(in: JsonReader, default: A): A = + // ${ genReadVal(rootTpe :: Nil, 'default, cfg.isStringified, false, 'in) } def encodeValue(in: A, out: JsonOutput): Unit + def decodeValue(in: JsonSource): A } diff --git a/src/main/scala/co.blocke.scalajack/json/JsonError.scala b/src/main/scala/co.blocke.scalajack/json/JsonError.scala index d10e0a59..f790f1d0 100644 --- a/src/main/scala/co.blocke.scalajack/json/JsonError.scala +++ b/src/main/scala/co.blocke.scalajack/json/JsonError.scala @@ -1,21 +1,23 @@ package co.blocke.scalajack package json -class JsonIllegalKeyType(msg: String) extends Throwable(msg) -class JsonNullKeyValue(msg: String) extends Throwable(msg) -class JsonUnsupportedType(msg: String) extends Throwable(msg) -class JsonConfigError(msg: String) extends Throwable(msg) -class JsonEitherLeftError(msg: String) extends Throwable(msg) +import scala.util.control.NoStackTrace -class ParseError(val msg: String) extends Throwable(msg): +class JsonIllegalKeyType(msg: String) extends Throwable(msg) with NoStackTrace +class JsonNullKeyValue(msg: String) extends Throwable(msg) with NoStackTrace +class JsonUnsupportedType(msg: String) extends Throwable(msg) with NoStackTrace +class JsonConfigError(msg: String) extends Throwable(msg) with NoStackTrace +class JsonEitherLeftError(msg: String) extends Throwable(msg) with NoStackTrace + +class ParseError(val msg: String) extends Throwable(msg) with NoStackTrace: val show: String = "" // Thrown at compile-time only! -case class JsonTypeError(override val msg: String) extends ParseError(msg): +case class JsonTypeError(override val msg: String) extends ParseError(msg) with NoStackTrace: override val show: String = "" // Thrown at runtime only! -case class JsonParseError(override val msg: String, context: reading.JsonSource) extends ParseError(msg + " at position " + context.pos): +case class JsonParseError(override val msg: String, context: reading.JsonSource) extends ParseError(msg + " at position " + context.pos) with NoStackTrace: override val show: String = val js = context.js.toString val (clip, dashes) = context.pos match { diff --git a/src/main/scala/co.blocke.scalajack/json/exp/JsonReader.scala b/src/main/scala/co.blocke.scalajack/json/exp/JsonReader.scala new file mode 100644 index 00000000..bb5ad981 --- /dev/null +++ b/src/main/scala/co.blocke.scalajack/json/exp/JsonReader.scala @@ -0,0 +1,440 @@ +package co.blocke.scalajack +package json +package exp + +import scala.annotation.{switch, tailrec} +import java.nio.ByteBuffer +import java.io.InputStream +import java.nio.charset.StandardCharsets.UTF_8 +import scala.specialized + +/* +Bottom Line: + Fancy string reading, ByteBuffer handling, etc. did NOT have a material effect on speed. + The SJ way was either equal to, or faster than, the JsonReader approach for reading strings! 
+*/ + +class JsonReader private[json]( + private[this] var buf: Array[Byte] = new Array[Byte](32768), + private[this] var head: Int = 0, + private[this] var tail: Int = 0, + private[this] var mark: Int = -1, + private[this] var charBuf: Array[Char] = new Array[Char](4096), + private[this] var bbuf: ByteBuffer = null, + private[this] var in: InputStream = null, + private[this] var totalRead: Long = 0 +): + + private[json] def read(s: String): String = { + val currBuf = this.buf + try { + this.buf = s.getBytes(UTF_8) + head = 0 + val to = buf.length + tail = to + totalRead = 0 + mark = -1 + readString("") + } finally { + this.buf = currBuf + } + } + + def readString(default: String): String = + if (isNextToken('"', head)) { + val pos = head + val len = parseString(0, Math.min(tail - pos, charBuf.length), charBuf, pos) + new String(charBuf, 0, len) + } else readNullOrTokenError(default, '"') + + @tailrec + private[this] def isNextToken(t: Byte, pos: Int): Boolean = + if (pos < tail) { + val b = buf(pos) + head = pos + 1 + b == t || ((b == ' ' || b == '\n' || (b | 0x4) == '\r') && nextToken(pos + 1) == t) + } else isNextToken(t, loadMoreOrError(pos)) + + @tailrec + private[this] def nextToken(pos: Int): Byte = + if (pos < tail) { + val b = buf(pos) + if (b == ' ' || b == '\n' || (b | 0x4) == '\r') nextToken(pos + 1) + else { + head = pos + 1 + b + } + } else nextToken(loadMoreOrError(pos)) + + @tailrec + private[this] def readNullOrTokenError[@specialized A](default: A, t: Byte): A = + if (default != null) { + val pos = head + if (pos != 0) { + if (pos + 2 < tail) { + val bs = ByteArrayAccess.getInt(buf, pos - 1) + if (bs == 0x6C6C756E) { + head = pos + 3 + default + } else tokenOrNullError(t, bs, pos) + } else if (buf(pos - 1) == 'n') { + head = loadMoreOrError(pos - 1) + 1 + readNullOrTokenError(default, t) + } else tokenOrNullError(t) + } else illegalTokenOperation() + } else tokenError(t) + + @tailrec + private[this] def parseString(i: Int, minLim: Int, charBuf: Array[Char], pos: Int): Int = + if (i + 3 < minLim) { // Based on SWAR routine of JSON string parsing: https://github.com/sirthias/borer/blob/fde9d1ce674d151b0fee1dd0c2565020c3f6633a/core/src/main/scala/io/bullet/borer/json/JsonParser.scala#L456 + val bs = ByteArrayAccess.getInt(buf, pos) + val m = ((bs - 0x20202020 ^ 0x3C3C3C3C) - 0x1010101 | (bs ^ 0x5D5D5D5D) + 0x1010101) & 0x80808080 + charBuf(i) = (bs & 0xFF).toChar + charBuf(i + 1) = (bs >> 8 & 0xFF).toChar + charBuf(i + 2) = (bs >> 16 & 0xFF).toChar + charBuf(i + 3) = (bs >> 24).toChar + if (m != 0) { + val offset = java.lang.Integer.numberOfTrailingZeros(m) >> 3 + if ((bs >> (offset << 3)).toByte == '"') { + head = pos + offset + 1 + i + offset + } else parseEncodedString(i + offset, charBuf.length - 1, charBuf, pos + offset) + } else parseString(i + 4, minLim, charBuf, pos + 4) + } else if (i < minLim) { + val b = buf(pos) + charBuf(i) = b.toChar + if (b == '"') { + head = pos + 1 + i + } else if ((b - 0x20 ^ 0x3C) <= 0) parseEncodedString(i, charBuf.length - 1, charBuf, pos) + else parseString(i + 1, minLim, charBuf, pos + 1) + } else if (pos >= tail) { + val newPos = loadMoreOrError(pos) + parseString(i, Math.min(charBuf.length, i + tail - newPos), charBuf, newPos) + } else parseString(i, Math.min(growCharBuf(i + 1), i + tail - pos), this.charBuf, pos) + + @tailrec + private[this] def parseEncodedString(i: Int, lim: Int, charBuf: Array[Char], pos: Int): Int = { + val remaining = tail - pos + if (i < lim) { + if (remaining > 0) { + val b1 = buf(pos) + if (b1 >= 0) { + if 
(b1 == '"') { + head = pos + 1 + i + } else if (b1 != '\\') { // 0aaaaaaa (UTF-8 byte) -> 000000000aaaaaaa (UTF-16 char) + if (b1 < ' ') unescapedControlCharacterError(pos) + charBuf(i) = b1.toChar + parseEncodedString(i + 1, lim, charBuf, pos + 1) + } else if (remaining > 1) { + val b2 = buf(pos + 1) + if (b2 != 'u') { + charBuf(i) = (b2: @switch) match { + case '"' => '"' + case 'n' => '\n' + case 'r' => '\r' + case 't' => '\t' + case 'b' => '\b' + case 'f' => '\f' + case '\\' => '\\' + case '/' => '/' + case _ => illegalEscapeSequenceError(pos + 1) + } + parseEncodedString(i + 1, lim, charBuf, pos + 2) + } else if (remaining > 5) { + val ch1 = readEscapedUnicode(pos + 2, buf) + charBuf(i) = ch1 + if (ch1 < 0xD800 || ch1 > 0xDFFF) parseEncodedString(i + 1, lim, charBuf, pos + 6) + else if (remaining > 11) { + if (buf(pos + 6) != '\\') illegalEscapeSequenceError(pos + 6) + if (buf(pos + 7) != 'u') illegalEscapeSequenceError(pos + 7) + val ch2 = readEscapedUnicode(pos + 8, buf) + if (ch1 >= 0xDC00 || ch2 < 0xDC00 || ch2 > 0xDFFF) decodeError("illegal surrogate character pair", pos + 11) + charBuf(i + 1) = ch2 + parseEncodedString(i + 2, lim, charBuf, pos + 12) + } else parseEncodedString(i, lim, charBuf, loadMoreOrError(pos)) + } else parseEncodedString(i, lim, charBuf, loadMoreOrError(pos)) + } else parseEncodedString(i, lim, charBuf, loadMoreOrError(pos)) + } else if ((b1 >> 5) == -2) { // 110bbbbb 10aaaaaa (UTF-8 bytes) -> 00000bbbbbaaaaaa (UTF-16 char) + if (remaining > 1) { + val b2 = buf(pos + 1) + if ((b1 & 0x1E) == 0 || (b2 & 0xC0) != 0x80) malformedBytesError(b1, b2, pos) + charBuf(i) = (b1 << 6 ^ b2 ^ 0xF80).toChar // 0xF80 == 0xC0.toByte << 6 ^ 0x80.toByte + parseEncodedString(i + 1, lim, charBuf, pos + 2) + } else parseEncodedString(i, lim, charBuf, loadMoreOrError(pos)) + } else if ((b1 >> 4) == -2) { // 1110cccc 10bbbbbb 10aaaaaa (UTF-8 bytes) -> ccccbbbbbbaaaaaa (UTF-16 char) + if (remaining > 2) { + val b2 = buf(pos + 1) + val b3 = buf(pos + 2) + val ch = (b1 << 12 ^ b2 << 6 ^ b3 ^ 0xFFFE1F80).toChar // 0xFFFE1F80 == 0xE0.toByte << 12 ^ 0x80.toByte << 6 ^ 0x80.toByte + if ((b1 == -32 && (b2 & 0xE0) == 0x80) || (b2 & 0xC0) != 0x80 || (b3 & 0xC0) != 0x80 || + (ch >= 0xD800 && ch <= 0xDFFF)) malformedBytesError(b1, b2, b3, pos) + charBuf(i) = ch + parseEncodedString(i + 1, lim, charBuf, pos + 3) + } else parseEncodedString(i, lim, charBuf, loadMoreOrError(pos)) + } else if ((b1 >> 3) == -2) { // 11110ddd 10ddcccc 10bbbbbb 10aaaaaa (UTF-8 bytes) -> 110110uuuuccccbb 110111bbbbaaaaaa (UTF-16 chars), where uuuu = ddddd - 1 + if (remaining > 3) { + val b2 = buf(pos + 1) + val b3 = buf(pos + 2) + val b4 = buf(pos + 3) + val cp = b1 << 18 ^ b2 << 12 ^ b3 << 6 ^ b4 ^ 0x381F80 // 0x381F80 == 0xF0.toByte << 18 ^ 0x80.toByte << 12 ^ 0x80.toByte << 6 ^ 0x80.toByte + if ((b2 & 0xC0) != 0x80 || (b3 & 0xC0) != 0x80 || (b4 & 0xC0) != 0x80 || + cp < 0x10000 || cp > 0x10FFFF) malformedBytesError(b1, b2, b3, b4, pos) + charBuf(i) = ((cp >>> 10) + 0xD7C0).toChar // 0xD7C0 == 0xD800 - (0x10000 >>> 10) + charBuf(i + 1) = ((cp & 0x3FF) + 0xDC00).toChar + parseEncodedString(i + 2, lim, charBuf, pos + 4) + } else parseEncodedString(i, lim, charBuf, loadMoreOrError(pos)) + } else malformedBytesError(b1, pos) + } else parseEncodedString(i, lim, charBuf, loadMoreOrError(pos)) + } else parseEncodedString(i, growCharBuf(i + 2) - 1, this.charBuf, pos) // 2 is length of surrogate pair + } + + private[this] def malformedBytesError(b1: Byte, pos: Int): Nothing = { + var i = appendString("malformed byte(s): 
0x", 0) + i = appendHexByte(b1, i, hexDigits) + decodeError(i, pos, null) + } + + private[this] def malformedBytesError(b1: Byte, b2: Byte, pos: Int): Nothing = { + val ds = hexDigits + var i = appendString("malformed byte(s): 0x", 0) + i = appendHexByte(b1, i, ds) + i = appendString(", 0x", i) + i = appendHexByte(b2, i, ds) + decodeError(i, pos + 1, null) + } + + private[this] def malformedBytesError(b1: Byte, b2: Byte, b3: Byte, pos: Int): Nothing = { + val ds = hexDigits + var i = appendString("malformed byte(s): 0x", 0) + i = appendHexByte(b1, i, ds) + i = appendString(", 0x", i) + i = appendHexByte(b2, i, ds) + i = appendString(", 0x", i) + i = appendHexByte(b3, i, ds) + decodeError(i, pos + 2, null) + } + + private[this] def malformedBytesError(b1: Byte, b2: Byte, b3: Byte, b4: Byte, pos: Int): Nothing = { + val ds = hexDigits + var i = appendString("malformed byte(s): 0x", 0) + i = appendHexByte(b1, i, ds) + i = appendString(", 0x", i) + i = appendHexByte(b2, i, ds) + i = appendString(", 0x", i) + i = appendHexByte(b3, i, ds) + i = appendString(", 0x", i) + i = appendHexByte(b4, i, ds) + decodeError(i, pos + 3, null) + } + + private[this] def appendHexByte(b: Byte, i: Int, ds: Array[Char]): Int = { + ensureCharBufCapacity(i + 2) + charBuf(i) = ds(b >> 4 & 0xF) + charBuf(i + 1) = ds(b & 0xF) + i + 2 + } + + private[this] def appendString(s: String, i: Int): Int = { + val len = s.length + val required = i + len + ensureCharBufCapacity(required) + s.getChars(0, len, charBuf, i) + required + } + + private[this] def ensureCharBufCapacity(required: Int): Unit = + if (charBuf.length < required) growCharBuf(required): Unit + + private final val hexDigits: Array[Char] = + Array('0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f') + + private[this] def decodeError(msg: String, pos: Int, cause: Throwable = null): Nothing = + decodeError(appendString(msg, 0), pos, cause) + + private[this] def decodeError(from: Int, pos: Int, cause: Throwable): Nothing = { + var i = appendString(", offset: 0x", from) + val offset = + if ((bbuf eq null) && (in eq null)) 0 + else totalRead - tail + i = appendHexOffset(offset + pos, i) + // if (config.appendHexDumpToParseException) { + // i = appendString(", buf:", i) + // i = appendHexDump(pos, offset.toInt, i) + // } + throw new JsonReaderException(new String(charBuf, 0, i), cause, true) //config.throwReaderExceptionWithStackTrace) + } + + private[this] def appendHexOffset(d: Long, i: Int): Int = { + ensureCharBufCapacity(i + 16) + val ds = hexDigits + var j = i + val dl = d.toInt + if (dl != d) { + val dh = (d >> 32).toInt + var shift = 32 - java.lang.Integer.numberOfLeadingZeros(dh) & 0x1C + while (shift >= 0) { + charBuf(j) = ds(dh >> shift & 0xF) + shift -= 4 + j += 1 + } + } + putHexInt(dl, j, charBuf, ds) + j + 8 + } + + private[this] def putHexInt(d: Int, i: Int, charBuf: Array[Char], ds: Array[Char]): Unit = { + charBuf(i) = ds(d >>> 28) + charBuf(i + 1) = ds(d >> 24 & 0xF) + charBuf(i + 2) = ds(d >> 20 & 0xF) + charBuf(i + 3) = ds(d >> 16 & 0xF) + charBuf(i + 4) = ds(d >> 12 & 0xF) + charBuf(i + 5) = ds(d >> 8 & 0xF) + charBuf(i + 6) = ds(d >> 4 & 0xF) + charBuf(i + 7) = ds(d & 0xF) + } + + private[this] def loadMoreOrError(pos: Int): Int = { + if ((bbuf eq null) && (in eq null)) endOfInputError() + loadMore(pos, throwOnEndOfInput = true) + } + + private[this] def loadMore(pos: Int): Int = + if ((bbuf eq null) && (in eq null)) pos + else loadMore(pos, throwOnEndOfInput = false) + + private[this] def loadMore(pos: Int, 
throwOnEndOfInput: Boolean): Int = { + var newPos = pos + val offset = + if (mark < 0) pos + else mark + if (offset > 0) { + newPos -= offset + val buf = this.buf + val remaining = tail - offset + var i = 0 + while (i < remaining) { + buf(i) = buf(i + offset) + i += 1 + } + if (mark > 0) mark = 0 + tail = remaining + head = newPos + } else growBuf() + var len = buf.length - tail + if (bbuf ne null) { + len = Math.min(bbuf.remaining, len) + bbuf.get(buf, tail, len) + } else len = Math.max(in.read(buf, tail, len), 0) + if (throwOnEndOfInput && len == 0) endOfInputError() + tail += len + totalRead += len + newPos + } + + private[json] def endOfInputOrError(): Unit = + if (skipWhitespaces()) decodeError("expected end of input", head) + + private[this] def endOfInputError(): Nothing = decodeError("unexpected end of input", tail) + private[this] def illegalEscapeSequenceError(pos: Int): Nothing = decodeError("illegal escape sequence", pos) + private[this] def unescapedControlCharacterError(pos: Int): Nothing = decodeError("unescaped control character", pos) + @tailrec + private[this] def hexDigitError(pos: Int): Nothing = { + if (nibbles(buf(pos) & 0xFF) < 0) decodeError("expected hex digit", pos) + hexDigitError(pos + 1) + } + private[this] def tokenError(t: Byte, pos: Int = head - 1): Nothing = { + var i = appendString("expected '", 0) + i = appendChar(t.toChar, i) + i = appendChar('\'', i) + decodeError(i, pos, null) + } + private[this] def tokenOrNullError(t: Byte, bs: Int, pos: Int): Nothing = tokenOrNullError(t, { + val b0 = bs.toByte + val b1 = (bs >> 8).toByte + val b2 = (bs >> 16).toByte + pos + + (if (b0 != 'n') -1 + else if (b1 != 'u') 0 + else if (b2 != 'l') 1 + else 2) + }) + private[this] def tokenOrNullError(t: Byte, pos: Int = head - 1): Nothing = { + var i = appendString("expected '", 0) + i = appendChar(t.toChar, i) + i = appendString("' or null", i) + decodeError(i, pos, null) + } + private[this] def illegalTokenOperation(): Nothing = + throw new IllegalStateException("expected preceding call of 'nextToken()' or 'isNextToken()'") + + private[this] def appendChar(ch: Char, i: Int): Int = { + ensureCharBufCapacity(i + 1) + charBuf(i) = ch + i + 1 + } + + private[json] def skipWhitespaces(): Boolean = { + var pos = head + var buf = this.buf + while ((pos < tail || { + pos = loadMore(pos) + buf = this.buf + pos < tail + }) && { + val b = buf(pos) + b == ' ' || b == '\n' || (b | 0x4) == '\r' + }) pos += 1 + head = pos + pos != tail + } + + private[this] def growBuf(): Unit = { + var bufLen = buf.length + // val maxBufSize = config.maxBufSize + // if (bufLen == maxBufSize) tooLongInputError() + bufLen <<= 1 + // if (bufLen > maxBufSize || bufLen < 0) bufLen = maxBufSize + buf = java.util.Arrays.copyOf(buf, bufLen) + } + + private[this] def growCharBuf(required: Int): Int = { + var charBufLen = charBuf.length + // val maxCharBufSize = config.maxCharBufSize + // if (charBufLen == maxCharBufSize) tooLongStringError() + charBufLen = (-1 >>> Integer.numberOfLeadingZeros(charBufLen | required)) + 1 + // if (charBufLen > maxCharBufSize || charBufLen < 0) charBufLen = maxCharBufSize + charBuf = java.util.Arrays.copyOf(charBuf, charBufLen) + charBufLen + } + + private[this] def readEscapedUnicode(pos: Int, buf: Array[Byte]): Char = { + val ns = nibbles + val x = + ns(buf(pos) & 0xFF) << 12 | + ns(buf(pos + 1) & 0xFF) << 8 | + ns(buf(pos + 2) & 0xFF) << 4 | + ns(buf(pos + 3) & 0xFF) + if (x < 0) hexDigitError(pos) + x.toChar + } + + private final val nibbles: Array[Byte] = Array( + -1, -1, 
-1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, -1, -1, -1, -1, -1, -1, + -1, 10, 11, 12, 13, 14, 15, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, 10, 11, 12, 13, 14, 15, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, + -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 + ) \ No newline at end of file diff --git a/src/main/scala/co.blocke.scalajack/json/exp/JsonReaderException.scala b/src/main/scala/co.blocke.scalajack/json/exp/JsonReaderException.scala new file mode 100644 index 00000000..a910deb0 --- /dev/null +++ b/src/main/scala/co.blocke.scalajack/json/exp/JsonReaderException.scala @@ -0,0 +1,6 @@ +package co.blocke.scalajack +package json +package exp + +class JsonReaderException private[json](msg: String, cause: Throwable, withStackTrace: Boolean) + extends RuntimeException(msg, cause, true, withStackTrace) \ No newline at end of file diff --git a/src/main/scala/co.blocke.scalajack/json/exp/package.scala b/src/main/scala/co.blocke.scalajack/json/exp/package.scala new file mode 100644 index 00000000..9717f4f1 --- /dev/null +++ b/src/main/scala/co.blocke.scalajack/json/exp/package.scala @@ -0,0 +1,15 @@ +package co.blocke.scalajack +package json + +import java.nio.ByteBuffer +import scala.{specialized => sp} + +package object exp { + + private[this] final val readerPool: ThreadLocal[JsonReader] = new ThreadLocal[JsonReader] { + override def initialValue(): JsonReader = new JsonReader + } + + def readFromString(s: String): String = + readerPool.get.read(s) +} diff --git a/src/main/scala/co.blocke.scalajack/json/reading/ClassDecoder.scala b/src/main/scala/co.blocke.scalajack/json/reading/ClassDecoder.scala index ff12cf56..00f10666 100644 --- a/src/main/scala/co.blocke.scalajack/json/reading/ClassDecoder.scala +++ b/src/main/scala/co.blocke.scalajack/json/reading/ClassDecoder.scala @@ -31,3 +31,18 @@ object ClassDecoder: // Construct the new object instantiator(fieldValues) } + + /* + +ClassDecoder.apply[Friend]( + Array[String]("name", "age", "email"), + List[JsonDecoder[_]]( + JsonDecoder.string, + JsonDecoder.int, + JsonDecoder.string + ).toArray, + ((fieldValues: scala.Array[_]) => new Friend(fieldValues.apply(0).asInstanceOf[java.lang.String], fieldValues.apply(1).asInstanceOf[scala.Int], fieldValues.apply(2).asInstanceOf[java.lang.String])), + scala.List.apply[scala.Any](0, 0, 0).toArray[scala.Any](scala.reflect.ClassTag.Any) + ) + + */ diff --git a/src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scala b/src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scalax similarity index 90% rename from src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scala rename to src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scalax index a8a12267..a3410384 100644 --- 
a/src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scala +++ b/src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scalax @@ -48,7 +48,7 @@ object JsonReader: case '[t] => r.elementRef.refType match case '[e] => - val elemDecoder = Expr.summon[JsonDecoder[e]].getOrElse(refRead[e](r.elementRef.asInstanceOf[RTypeRef[e]])) + val elemDecoder = Expr.summon[JsonDecoder[e]].getOrElse(refRead2[e](r.elementRef.asInstanceOf[RTypeRef[e]])) '{ JsonDecoder.seq[e]($elemDecoder) map (_.to(${ Expr.summon[Factory[e, T]].get })) } @@ -59,7 +59,7 @@ object JsonReader: r.fields.map(f => f.fieldRef.refType match case '[e] => - Expr.summon[JsonDecoder[e]].getOrElse(refRead[e](f.fieldRef.asInstanceOf[RTypeRef[e]])) + Expr.summon[JsonDecoder[e]].getOrElse(refRead2[e](f.fieldRef.asInstanceOf[RTypeRef[e]])) ) ) val instantiator = JsonReaderUtil.classInstantiator[T](r.asInstanceOf[ClassRef[T]]) @@ -88,4 +88,7 @@ object JsonReader: else Expr(null.asInstanceOf[Int]) }) - '{ ClassDecoder[T]($fieldNames, $fieldDecoders.toArray, $instantiator, $preloaded.toArray) } + val x = '{ ClassDecoder[T]($fieldNames, $fieldDecoders.toArray, $instantiator, $preloaded.toArray) } + println(s"Class Decoder: ${x.show}") + println("---------------------") + x diff --git a/src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scalax2 b/src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scalax2 new file mode 100644 index 00000000..e72bc0b1 --- /dev/null +++ b/src/main/scala/co.blocke.scalajack/json/reading/JsonReader.scalax2 @@ -0,0 +1,114 @@ +package co.blocke.scalajack +package json +package reading + +import scala.annotation.* + +object JsonReader: + protected val ull: Array[Char] = "ull".toCharArray + protected val alse: Array[Char] = "alse".toCharArray + protected val rue: Array[Char] = "rue".toCharArray + +case class JsonReader(src: JsonSource): + + private var expectFieldValue = false + + // returns false if 'null' found + def expectObjectStart(): Boolean = + src.readSkipWhitespace() match { + case '{' => true + case 'n' => + readChars(JsonReader.ull, "null") + false + case c => throw new JsonParseError(s"Expected object start '{' but found '$c'", src) + } + + def expectArrayStart(): Boolean = + src.readSkipWhitespace() match { + case '[' => true + case 'n' => + readChars(JsonReader.ull, "null") + false + case c => throw new JsonParseError(s"Expected array start '[' but found '$c'", src) + } + + // True if we got a comma, and False for ] + def nextArrayElement(): Boolean = + (src.readSkipWhitespace(): @switch) match + case ',' => true + case ']' => false + case c => throw JsonParseError(s"Expected ',' or ']' got '$c'", src) + + // True if we got a comma, and False for } + def nextField(): Boolean = + (src.readSkipWhitespace(): @switch) match { + case ',' => + expectFieldValue = false + true + case '}' if !expectFieldValue => false + case '}' => + throw JsonParseError("Expected field value but got '}' instead.", src) + case c => + throw JsonParseError(s"expected ',' or '}' got '$c'", src) + } + + inline def expectFieldName(): CharSequence = + val charseq = expectString() + expectFieldValue = true + charseq + + // need flavor of expectString that might be null for field values + + private def expectString(): CharSequence = + charWithWS(src, '"') + val sb = new FastStringBuilder(64) + while true do + val c = src.readEscapedString() + if c == END_OF_STRING then return sb.buffer // mutable thing escapes, but cannot be changed + sb.append(c.toChar) + throw JsonParseError("Invalid string value detected", src) 
+ + inline def expectChar(): Char = + expectString() match { + case s if s.length == 1 => s.charAt(0) + case s => throw new JsonParseError(s"Expected a Char value but got '$s'", src) + } + + def expectBoolean(): Boolean = + (src.readSkipWhitespace(): @switch) match + case 't' => + readChars(JsonReader.rue, "true") + true + case 'f' => + readChars(JsonReader.alse, "false") + false + case c => throw JsonParseError(s"Expected 'true' or 'false' got '$c'", src) + + def expectInt(): Int = + checkNumber() + try { + val i = UnsafeNumbers.int_(src, false) + src.retract() + i + } catch { + case UnsafeNumbers.UnsafeNumber => throw JsonParseError("Expected an Int", src) + } + + private inline def readChars( + expect: Array[Char], + errMsg: String + ): Unit = + var i: Int = 0 + while i < expect.length do + if src.read() != expect(i) then throw JsonParseError(s"Expected $errMsg", src) + i += 1 + + private def checkNumber(): Unit = + (src.readSkipWhitespace(): @switch) match + case '-' | '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9' | '.' => () + case c => throw JsonParseError(s"Expected a number, got $c", src) + src.retract() + + inline def charWithWS(in: JsonSource, c: Char): Unit = + val got = in.readSkipWhitespace() + if got != c then throw JsonParseError(s"Expected '$c' got '$got'", src) diff --git a/src/main/scala/co.blocke.scalajack/json/reading/JsonSource.scala b/src/main/scala/co.blocke.scalajack/json/reading/JsonSource.scala index fb90eac4..562aa20a 100644 --- a/src/main/scala/co.blocke.scalajack/json/reading/JsonSource.scala +++ b/src/main/scala/co.blocke.scalajack/json/reading/JsonSource.scala @@ -4,18 +4,27 @@ package reading import scala.annotation.* +object JsonSource: + protected val ull: Array[Char] = "ull".toCharArray + protected val alse: Array[Char] = "alse".toCharArray + protected val rue: Array[Char] = "rue".toCharArray + // ZIO-Json defines a series of different Readers. Not exactly sure why--maybe to support different // modes (streaming, ...)? At least for now we only need one, so merged key bits of Readers into one. 
case class JsonSource(js: CharSequence): private var i = 0 + private var expectFieldValue = false private[json] val max = js.length def pos = i + def here = js.charAt(i) + inline def read(): Char = if i < max then + val c = history(i) i += 1 - history(i - 1) + c else BUFFER_EXCEEDED inline def readSkipWhitespace(): Char = @@ -68,3 +77,120 @@ case class JsonSource(js: CharSequence): accum = accum * 16 + c i += 1 accum.toChar + +//------- + + // returns false if 'null' found + def expectObjectStart(): Boolean = + readSkipWhitespace() match { + case '{' => + true + case 'n' => + readChars(JsonSource.ull, "null") + false + case c => throw new JsonParseError(s"Expected object start '{' but found '$c'", this) + } + + def expectArrayStart(): Boolean = + readSkipWhitespace() match { + case '[' => + true + case 'n' => + readChars(JsonSource.ull, "null") + false + case c => throw new JsonParseError(s"Expected array start '[' but found '$c'", this) + } + + // True if we got a comma, and False for ] + def nextArrayElement(): Boolean = + (readSkipWhitespace(): @switch) match + case ',' => + true + case ']' => + false + case c => throw JsonParseError(s"Expected ',' or ']' got '$c'", this) + + // True if we got a comma, and False for } + def nextField(): Boolean = + (readSkipWhitespace(): @switch) match { + case ',' => + expectFieldValue = false + true + case '}' if !expectFieldValue => + false + case '}' => + throw JsonParseError("Expected field value but got '}' instead.", this) + case c => + throw JsonParseError(s"expected ',' or '}' got '$c'", this) + } + + inline def expectFieldName(): CharSequence = + val charseq = parseString() + expectFieldValue = true + charseq + + // Value might be null! + def expectString(): CharSequence = + readSkipWhitespace() match { + case '"' => + retract() + parseString() + case 'n' => + readChars(JsonSource.ull, "null") + null + case c => throw new JsonParseError(s"Expected a String value but got '$c'", this) + } + + private def parseString(): CharSequence = + charWithWS('"') + val sb = new FastStringBuilder(64) + while true do + val c = readEscapedString() + if c == END_OF_STRING then return sb.buffer // mutable thing escapes, but cannot be changed + sb.append(c.toChar) + throw JsonParseError("Invalid string value detected", this) + + inline def expectChar(): Char = + expectString() match { + case s if s.length == 1 => s.charAt(0) + case s => throw new JsonParseError(s"Expected a Char value but got '$s'", this) + } + + def expectBoolean(): Boolean = + (readSkipWhitespace(): @switch) match + case 't' => + readChars(JsonSource.rue, "true") + true + case 'f' => + readChars(JsonSource.alse, "false") + false + case c => throw JsonParseError(s"Expected 'true' or 'false' got '$c'", this) + + def expectInt(): Int = + checkNumber() + try { + val i = UnsafeNumbers.int_(this, false) + retract() + i + } catch { + case UnsafeNumbers.UnsafeNumber => throw JsonParseError("Expected an Int", this) + } + + private inline def readChars( + expect: Array[Char], + errMsg: String + ): Unit = + var i: Int = 0 + while i < expect.length do + if read() != expect(i) then throw JsonParseError(s"Expected $errMsg", this) + i += 1 + + private def checkNumber(): Unit = + (readSkipWhitespace(): @switch) match + case '-' | '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9' | '.' 
=> () + case c => throw JsonParseError(s"Expected a number, got $c", this) + retract() + + inline def charWithWS(c: Char): Unit = + val got = readSkipWhitespace() + if got != c then throw JsonParseError(s"Expected '$c' got '$got'", this) diff --git a/src/main/scala/co.blocke.scalajack/json/writing/JsonCodecMaker.scala b/src/main/scala/co.blocke.scalajack/json/writing/JsonCodecMaker.scala index 7385b467..d6ea82b5 100644 --- a/src/main/scala/co.blocke.scalajack/json/writing/JsonCodecMaker.scala +++ b/src/main/scala/co.blocke.scalajack/json/writing/JsonCodecMaker.scala @@ -6,8 +6,10 @@ import co.blocke.scala_reflection.{RTypeRef, TypedName} import co.blocke.scala_reflection.reflect.ReflectOnType import co.blocke.scala_reflection.reflect.rtypeRefs.* import co.blocke.scala_reflection.rtypes.{EnumRType, JavaClassRType, NonConstructorFieldInfo} +import reading.JsonSource import scala.jdk.CollectionConverters.* import scala.quoted.* +import scala.collection.Factory import dotty.tools.dotc.ast.Trees.EmptyTree import org.apache.commons.text.StringEscapeUtils import org.apache.commons.lang3.text.translate.CharSequenceTranslator @@ -19,25 +21,26 @@ object JsonCodecMaker: // Cache generated method Symbols + an array of the generated functions (DefDef) case class MethodKey(ref: RTypeRef[?], isStringified: Boolean) // <-- TODO: Not clear what isStringified does here... - val methodSyms = new scala.collection.mutable.HashMap[MethodKey, Symbol] - val methodDefs = new scala.collection.mutable.ArrayBuffer[DefDef] + + val writeMethodSyms = new scala.collection.mutable.HashMap[MethodKey, Symbol] + val writeMethodDefs = new scala.collection.mutable.ArrayBuffer[DefDef] // Fantastic Dark Magic here--lifted from Jasoniter. Props! This thing will create a DefDef, and a Symbol to it. // The Symbol will let you call the generated function later from other macro-generated code. The goal is to use // generated functions to create cleaner/faster macro code than what straight quotes/splices would create unaided. - def makeFn[U: Type](methodKey: MethodKey, arg: Expr[U], out: Expr[JsonOutput])(f: (Expr[U], Expr[JsonOutput]) => Expr[Unit]): Expr[Unit] = + def makeWriteFn[U: Type](methodKey: MethodKey, arg: Expr[U], out: Expr[JsonOutput])(f: (Expr[U], Expr[JsonOutput]) => Expr[Unit]): Expr[Unit] = // Get a symbol, if one already created for this key... else make one. Apply( Ref( - methodSyms.getOrElse( + writeMethodSyms.getOrElse( methodKey, { val sym = Symbol.newMethod( Symbol.spliceOwner, - "w" + methodSyms.size, // 'w' is for Writer! + "w" + writeMethodSyms.size, // 'w' is for Writer! 
MethodType(List("in", "out"))(_ => List(TypeRepr.of[U], TypeRepr.of[JsonOutput]), _ => TypeRepr.of[Unit]) ) - methodSyms.update(methodKey, sym) - methodDefs += DefDef( + writeMethodSyms.update(methodKey, sym) + writeMethodDefs += DefDef( sym, params => { val List(List(in, out)) = params @@ -51,6 +54,39 @@ object JsonCodecMaker: List(arg.asTerm, out.asTerm) ).asExprOf[Unit] + val readMethodSyms = new scala.collection.mutable.HashMap[MethodKey, Symbol] + val readMethodDefs = new scala.collection.mutable.ArrayBuffer[DefDef] + + class JsonReader // temporary to compile until we write the real JsonReader + + def makeReadFn[U: Type](methodKey: MethodKey, arg: Expr[U], in: Expr[JsonReader])(f: (Expr[JsonReader], Expr[U]) => Expr[U])(using Quotes)(using Type[JsonReader]): Expr[U] = + methodKey.ref.refType match + case '[tt] => + val typerepr = TypeRepr.of[tt] + Apply( + Ref( + readMethodSyms.getOrElse( + methodKey, { + val sym = Symbol.newMethod( + Symbol.spliceOwner, + "r" + readMethodSyms.size, + MethodType(List("in", "default"))(_ => List(TypeRepr.of[JsonReader], typerepr), _ => TypeRepr.of[U]) + ) + readMethodSyms.update(methodKey, sym) + readMethodDefs += DefDef( + sym, + params => { + val List(List(in, default)) = params + Some(f(in.asExprOf[JsonReader], default.asExprOf[U]).asTerm.changeOwner(sym)) + } + ) + sym + } + ) + ), + List(in.asTerm, arg.asTerm) + ).asExprOf[U] + // --------------------------------------------------------------------------------------------- def maybeWrite[T](label: String, aE: Expr[T], ref: RTypeRef[T], out: Expr[JsonOutput], cfg: JsonConfig): Expr[Unit] = @@ -205,12 +241,12 @@ object JsonCodecMaker: // --------------------------------------------------------------------------------------------- - def genFnBody[T](r: RTypeRef[?], aE: Expr[T], out: Expr[JsonOutput], emitDiscriminator: Boolean = false, inTuple: Boolean = false)(using Quotes): Expr[Unit] = + def genEncFnBody[T](r: RTypeRef[?], aE: Expr[T], out: Expr[JsonOutput], emitDiscriminator: Boolean = false, inTuple: Boolean = false)(using Quotes): Expr[Unit] = r.refType match case '[b] => r match case t: ArrayRef[?] => - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => t.elementRef.refType match case '[e] => val tin = in.asInstanceOf[Expr[Array[e]]] @@ -226,7 +262,7 @@ object JsonCodecMaker: } case t: SeqRef[?] => - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => t.elementRef.refType match case '[e] => val tin = if t.isMutable then in.asExprOf[scala.collection.mutable.Seq[e]] else in.asExprOf[Seq[e]] @@ -242,7 +278,7 @@ object JsonCodecMaker: } case t: SetRef[?] 
=> - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => t.elementRef.refType match case '[e] => val tin = if t.isMutable then in.asExprOf[scala.collection.mutable.Set[e]] else in.asExprOf[Set[e]] @@ -270,12 +306,12 @@ object JsonCodecMaker: case '[c] => val subtype = TypeIdent(TypeRepr.of[c].typeSymbol) val sym = Symbol.newBind(Symbol.spliceOwner, "t", Flags.EmptyFlags, subtype.tpe) - CaseDef(Bind(sym, Typed(Ref(sym), subtype)), None, genFnBody[c](child, Ref(sym).asExprOf[c], out, true).asTerm) + CaseDef(Bind(sym, Typed(Ref(sym), subtype)), None, genEncFnBody[c](child, Ref(sym).asExprOf[c], out, true).asTerm) } :+ CaseDef(Literal(NullConstant()), None, '{ $out.burpNull() }.asTerm) val matchExpr = Match(aE.asTerm, cases).asExprOf[Unit] matchExpr - // We don't use makeFn here because a value class is basically just a "box" around a simple type + // We don't use makeWriteFn here because a value class is basically just a "box" around a simple type case t: ScalaClassRef[?] if t.isValueClass => val theField = t.fields.head.fieldRef theField.refType match @@ -284,7 +320,7 @@ object JsonCodecMaker: genWriteVal(fieldValue, theField.asInstanceOf[RTypeRef[e]], out) case t: ScalaClassRef[?] => - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => val body = { val eachField = t.fields.map { f => f.fieldRef.refType match @@ -339,7 +375,7 @@ object JsonCodecMaker: } case t: MapRef[?] => - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => t.elementRef.refType match case '[k] => t.elementRef2.refType match @@ -359,7 +395,7 @@ object JsonCodecMaker: } case t: JavaCollectionRef[?] => - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => t.elementRef.refType match case '[e] => val tin = in.asExprOf[java.util.Collection[_]] @@ -375,7 +411,7 @@ object JsonCodecMaker: } case t: JavaMapRef[?] => - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => t.elementRef.refType match case '[k] => t.elementRef2.refType match @@ -395,7 +431,7 @@ object JsonCodecMaker: } case t: JavaClassRef[?] => - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => t.refType match case '[p] => val rtype = t.expr.asExprOf[JavaClassRType[p]] @@ -443,12 +479,12 @@ object JsonCodecMaker: case '[c] => val subtype = TypeIdent(TypeRepr.of[c].typeSymbol) val sym = Symbol.newBind(Symbol.spliceOwner, "t", Flags.EmptyFlags, subtype.tpe) - CaseDef(Bind(sym, Typed(Ref(sym), subtype)), None, genFnBody[c](child, Ref(sym).asExprOf[c], out, true).asTerm) + CaseDef(Bind(sym, Typed(Ref(sym), subtype)), None, genEncFnBody[c](child, Ref(sym).asExprOf[c], out, true).asTerm) } :+ CaseDef(Literal(NullConstant()), None, '{ $out.burpNull() }.asTerm) val matchExpr = Match(aE.asTerm, cases).asExprOf[Unit] matchExpr - // No makeFn here--Option is just a wrapper to the real thingy + // No makeWriteFn here--Option is just a wrapper to the real thingy case t: OptionRef[?] 
=> t.optionParamType.refType match case '[e] => @@ -467,20 +503,20 @@ object JsonCodecMaker: ${ genWriteVal[e]('{ vv }, t.optionParamType.asInstanceOf[RTypeRef[e]], out) } } - // No makeFn here... SelfRef is referring to something we've already seen before. There absolutely should already be a geneated + // No makeWriteFn here... SelfRef is referring to something we've already seen before. There absolutely should already be a geneated // and cached function for this thing that we can call. case t: SelfRefRef[?] => t.refType match case '[e] => val key = MethodKey(ReflectOnType[e](q)(TypeRepr.of[e])(using scala.collection.mutable.Map.empty[TypedName, Boolean]), false) - val sym = methodSyms(key) + val sym = writeMethodSyms(key) val tin = aE.asExprOf[b] '{ if $tin == null then $out.burpNull() else ${ Ref(sym).appliedTo(tin.asTerm, out.asTerm).asExprOf[Unit] } } - // No makeFn here. All LeftRight types (Either, Union, Intersection) are just type wrappers + // No makeWriteFn here. All LeftRight types (Either, Union, Intersection) are just type wrappers case t: LeftRightRef[?] => val tin = aE.asExprOf[b] t.leftRef.refType match @@ -550,7 +586,7 @@ object JsonCodecMaker: ${ genWriteVal[lt]('{ $tin.asInstanceOf[lt] }, t.leftRef.asInstanceOf[RTypeRef[lt]], out, inTuple = inTuple) } } - // No makeFn here. Try is just a wrapper + // No makeWriteFn here. Try is just a wrapper case t: TryRef[?] => t.tryRef.refType match case '[e] => @@ -573,7 +609,7 @@ object JsonCodecMaker: } case t: TupleRef[?] => - makeFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => + makeWriteFn[b](MethodKey(t, false), aE.asInstanceOf[Expr[b]], out) { (in, out) => '{ if $in == null then $out.burpNull() else @@ -601,14 +637,12 @@ object JsonCodecMaker: def genWriteVal[T: Type]( aE: Expr[T], ref: RTypeRef[T], - // optWriteDiscriminator: Option[WriteDiscriminator], out: Expr[JsonOutput], - // cfgE: Expr[JsonConfig], isStringified: Boolean = false, // e.g. Map key values. Doesn't apply to stringish values, which are always quotes-wrapped inTuple: Boolean = false )(using Quotes): Expr[Unit] = val methodKey = MethodKey(ref, false) - methodSyms + writeMethodSyms .get(methodKey) .map { sym => // hit cache first... then match on Ref type Apply(Ref(sym), List(aE.asTerm, out.asTerm)).asExprOf[Unit] @@ -752,9 +786,80 @@ object JsonCodecMaker: // Everything else... case _ if isStringified => throw new JsonIllegalKeyType("Non-primitive/non-simple types cannot be map keys") - case _ => genFnBody(ref, aE, out, inTuple = inTuple) + case _ => genEncFnBody(ref, aE, out, inTuple = inTuple) ) + // --------------------------------------------------------------------------------------------- + + def genReadVal[T: Type]( + // default: Expr[T], // needed? This should already be in ref... + ref: RTypeRef[T], + in: Expr[JsonSource], + isStringified: Boolean = false, // e.g. Map key values. Doesn't apply to stringish values, which are always quotes-wrapped + inTuple: Boolean = false // not sure if needed... + )(using Quotes): Expr[T] = + val methodKey = MethodKey(ref, false) + // Stuff here to hit readMethodSyms first -- TBD + ref match + // First cover all primitive and simple types... 
+ // case t: BigDecimalRef => + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[scala.math.BigDecimal] }) } + // else '{ $out.value(${ aE.asExprOf[scala.math.BigDecimal] }) } + // case t: BigIntRef => + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[scala.math.BigInt] }) } + // else '{ $out.value(${ aE.asExprOf[scala.math.BigInt] }) } + case t: BooleanRef => + '{ $in.expectBoolean() } + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[Boolean] }) } + // else '{ $out.value(${ aE.asExprOf[Boolean] }) } + // case t: ByteRef => + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[Byte] }) } + // else '{ $out.value(${ aE.asExprOf[Byte] }) } + // case t: CharRef => '{ $out.value(${ aE.asExprOf[Char] }) } + // case t: DoubleRef => + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[Double] }) } + // else '{ $out.value(${ aE.asExprOf[Double] }) } + // case t: FloatRef => + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[Float] }) } + // else '{ $out.value(${ aE.asExprOf[Float] }) } + case t: IntRef => + '{ $in.expectInt() } + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[Int] }) } + // else '{ $out.value(${ aE.asExprOf[Int] }) } + // case t: LongRef => + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[Long] }) } + // else '{ $out.value(${ aE.asExprOf[Long] }) } + // case t: ShortRef => + // if isStringified then '{ $out.valueStringified(${ aE.asExprOf[Short] }) } + // else '{ $out.value(${ aE.asExprOf[Short] }) } + case t: StringRef => + '{ Option($in.expectString()).map(_.toString).getOrElse(null) } + // if cfg.escapedStrings then '{ $out.value(StringEscapeUtils.escapeJson(${ aE.asExprOf[String] })) } + // else '{ $out.value(${ aE.asExprOf[String] }) } + + case _ => + ref.refType match + case '[b] => + ref match + case t: SeqRef[?] => + t.elementRef.refType match + case '[e] => + '{ + if ! $in.expectArrayStart() then null.asInstanceOf[T] + else if $in.here == ']' then // empty Seq + $in.read() // skip the ']' + List.empty[e].to(${ Expr.summon[Factory[e, T]].get }) // create appropriate Seq[T] here + else + val acc = scala.collection.mutable.ListBuffer.empty[e] + acc.addOne(${ genReadVal[e](t.elementRef.asInstanceOf[RTypeRef[e]], in) }) + while $in.nextArrayElement() do acc.addOne(${ genReadVal[e](t.elementRef.asInstanceOf[RTypeRef[e]], in) }) + acc.to(${ Expr.summon[Factory[e, T]].get }) // create appropriate Seq[T] here + } + + // case _ => genDecFnBody(ref, in, inTuple = inTuple) + + // --------------------------------------------------------------------------------------------- + // ================================================================ // You've made it this far! Ok, now we sew everything together. // We generate a codec class and then kick off a deep traversal of @@ -771,11 +876,12 @@ object JsonCodecMaker: // } def encodeValue(in: T, out: JsonOutput): Unit = ${ genWriteVal('in, ref, 'out) } + def decodeValue(in: JsonSource): T = ${ genReadVal(ref, 'in) } } }.asTerm val neededDefs = // others here??? 
Refer to Jsoniter file JsonCodecMaker.scala - methodDefs + writeMethodDefs val codec = Block(neededDefs.toList, codecDef).asExprOf[JsonCodec[T]] // println(s"Codec: ${codec.show}") codec diff --git a/src/main/scala/co.blocke.scalajack/run/Play.scala b/src/main/scala/co.blocke.scalajack/run/Play.scala index ec8aee08..72e942a4 100644 --- a/src/main/scala/co.blocke.scalajack/run/Play.scala +++ b/src/main/scala/co.blocke.scalajack/run/Play.scala @@ -1,65 +1,52 @@ package co.blocke.scalajack +package json package run import co.blocke.scala_reflection.* import scala.jdk.CollectionConverters.* import scala.reflect.ClassTag import json.* +import scala.collection.immutable.Queue -object RunMe extends App: - - // import scala.util.Random - // val random = new Random() - - // def scramble(hash: Int): String = - // val last5 = f"$hash%05d".takeRight(5) - // val digits = (1 to 5).map(_ => random.nextInt(10)) - // if digits(0) % 2 == 0 then s"${last5(0)}${digits(0)}${last5(1)}${digits(1)}${last5(2)}-${digits(2)}${last5(3)}${digits(3)}-${last5(4)}${digits(4)}A" - // else s"${digits(0)}${last5(0)}${digits(1)}${last5(1)}${digits(2)}-${last5(2)}${digits(3)}${last5(3)}-${digits(4)}${last5(4)}B" - - try - - import json.* - import ScalaJack.* - - // val o = json.writing.JsonOutput() - // val a: Any = Foo("Hey", Fish("Bloop", Some(true)), ("ok", Seq(true, false))) - // json.writing.AnyWriter.writeAny(a, o) - // println(o.result) - - val p = Person2(XList(List("x", "y"))) - println(RType.of[Person2].pretty) - val js = sj[Person2].toJson(p) - println(js) +class Shape[T](polygon: T) +class Parallelogram() +class Rectangle() extends Parallelogram - // val inst = Blah("wow", Some(111)) // Some(Some(None))) // Some(Some(3))) - // val js = sj[Blah].toJson(inst) - // println(js) - - // co.blocke.scalajack.internal.CodePrinter.code { - // sj[Record] - // } - - // val v = Foo("Hey", Fish("Bloop", None), None, Color.Blue) - // val v = Foo("Hey", "Boo") - - // println(ScalaJack[Foo].toJson(v)) - // println(sj[Foo](JsonConfig.withTypeHintLabel("bogus")).toJson(v)) - - // println(sj[Record].toJson(record)) - - // println("------") - - // println(sj[Record].fromJson(jsData)) - catch { - case t: Throwable => - println(s"BOOM ($t): " + t.getMessage) - t.printStackTrace - } +object RunMe extends App: - // val s1 = scramble(15) - // val s2 = scramble(394857) - // println(s1) - // println(s2) - // println(descrambleTest(s1, 15)) - // println(descrambleTest(s2, 394857)) + val suite: Shape[Parallelogram] = new Shape[Parallelogram](new Parallelogram()) + + // UPDATE: I have no idea what these two cases actually test! They seem to do different things... 
+ + // val s = "\"This is a test\"" + // val now = System.nanoTime() + // (1 to 1000000).map(_ => exp.readFromString(s)) + // val later = System.nanoTime() + // println("JsonReader: " + (later - now)) + + // println("==============================") + + // val s2 = "This is a test\"" + // val now2 = System.nanoTime() + // (1 to 1000000).map(_ => parseString(reading.JsonSource(s2))) + // val later2 = System.nanoTime() + // println("SJ : " + (later2 - now2)) + + // def parseString(in: reading.JsonSource): CharSequence = + // // charWithWS(in, '"') + // val sb = new reading.FastStringBuilder(64) + // while true do + // val c = in.readEscapedString() + // if c == END_OF_STRING then return sb.buffer // mutable thing escapes, but cannot be changed + // sb.append(c.toChar) + // throw JsonParseError("Invalid string value detected", in) + + import ScalaJack.* + import co.blocke.scalajack.run.Record + println("\n") + implicit val blah: ScalaJack[List[Queue[Int]]] = sj[List[Queue[Int]]] + println(ScalaJack[List[Queue[Int]]].fromJson("[[1,2,3],[4,5,6],[7,8,9]]")) + // implicit val blah: ScalaJack[Record] = sj[Record] + // println(ScalaJack[Record].fromJson(co.blocke.scalajack.run.jsData)) + + println("done.")
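
---

Editor's note on the new read side introduced above: the `SeqRef` case of `genReadVal` emits code that expects a `[`, short-circuits on an empty array, otherwise accumulates elements into a `ListBuffer`, and finally materializes the target collection through a summoned `Factory`. The sketch below is a hand-written, runnable analogue of that expansion for a sequence of `Int`. It is not the macro output itself, and `SimpleSource` (with its `expectArrayStart`/`nextArrayElement`/`expectInt` methods) is a hypothetical stand-in for the real `JsonSource` in this patch, which works over raw input far more efficiently.

```
// Illustrative sketch only -- NOT the macro-generated code.
// "SimpleSource" is a hypothetical stand-in for JsonSource.
import scala.collection.Factory
import scala.collection.mutable.ListBuffer

final class SimpleSource(s: String):
  private var i = 0
  private def skipWs(): Unit = while i < s.length && s(i).isWhitespace do i += 1
  def here: Char = s(i)
  def read(): Char = { val c = s(i); i += 1; c }
  def expectArrayStart(): Boolean = { skipWs(); if here == '[' then { read(); true } else false }
  def nextArrayElement(): Boolean =
    skipWs()
    read() match
      case ',' => true
      case ']' => false
      case c   => throw new Exception(s"Expected ',' or ']' but found '$c'")
  def expectInt(): Int =
    skipWs()
    val start = i
    if here == '-' then i += 1
    while i < s.length && s(i).isDigit do i += 1
    s.substring(start, i).toInt

// Rough shape of the decoder genReadVal produces for a Seq-like type (element type fixed to Int here)
def readIntSeq[T](in: SimpleSource)(using factory: Factory[Int, T]): T =
  if !in.expectArrayStart() then null.asInstanceOf[T]
  else if in.here == ']' then
    in.read() // consume the ']' of an empty array
    List.empty[Int].to(factory)
  else
    val acc = ListBuffer.empty[Int]
    acc.addOne(in.expectInt())
    while in.nextArrayElement() do acc.addOne(in.expectInt())
    acc.to(factory)

@main def runReadSketch(): Unit =
  println(readIntSeq[List[Int]](SimpleSource("[1, 2, 3]")))   // List(1, 2, 3)
  println(readIntSeq[Vector[Int]](SimpleSource("[]")))        // Vector()
```

The `Factory`-based finish mirrors the `Expr.summon[Factory[e, T]]` call in the patch: one generated loop can materialize whatever concrete Scala collection the target type calls for.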