New `encoding/json/v2` package (hidden behind `GOEXPERIMENT=jsonv2` flag)! It brings perf improvements and finally allows devs to implement custom marshalers for external types:
> Alternatively, users can implement functions that match MarshalFunc, MarshalToFunc, UnmarshalFunc, or UnmarshalFromFunc to specify the JSON representation for arbitrary types. This provides callers of JSON functionality with control over how any arbitrary type is serialized as JSON.
> We expect the design of encoding/json/v2 to continue to evolve. We encourage developers to try out the new API and provide feedback on the proposal issue.
Does anyone have more knowledge on what this refers to? I thought v2 landing as an experiment meant they were happy with it, but "we expect it to evolve" sounds like "we know it's not good yet". Maybe I am understanding it the wrong way, though; it's just that other experiments were more like "this is new code, please test" and not "this will change".
It means we're not confident the API is stable yet. There might be further changes before the final, non-experimental version, depending on user feedback and further experience with the current proposal.
In the case of encoding/json/v2, enabling GOEXPERIMENT=jsonv2 has two major effects:
1. It flips encoding/json (the original, not /v2) to use the new implementation. This is supposed to be a fully backwards-compatible change, modulo some changes to the text of some errors. We're very interested to hear of any cases of existing programs breaking when the experiment is turned on, because (aside from the aforementioned error text, which you shouldn't be depending on) it likely indicates a bug that needs fixing. This is the "new code, please test" half of the change.
2. It enables the new API (encoding/json/v2, encoding/json/jsontext, some new options in encoding/json). This is the "unstable API, might change in response to feedback" half of the change.
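For a concrete idea of that second half, here is a minimal sketch (assuming the project is built with GOEXPERIMENT=jsonv2; the API is experimental and may change) of registering a caller-side marshaler for a type you don't own, using the MarshalFunc/WithMarshalers functions quoted above:

```go
package main

import (
	json "encoding/json/v2" // only available when built with GOEXPERIMENT=jsonv2
	"fmt"
	"time"
)

func main() {
	// Marshal time.Duration (an external type we can't add methods to)
	// as a human-readable string, without defining a wrapper type.
	durAsString := json.MarshalFunc(func(d time.Duration) ([]byte, error) {
		return json.Marshal(d.String())
	})

	out, err := json.Marshal(
		map[string]time.Duration{"timeout": 1500 * time.Millisecond},
		json.WithMarshalers(durAsString),
	)
	fmt.Println(string(out), err) // expected: {"timeout":"1.5s"} <nil>
}
```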
The arena experiment was essentially placed on indefinite hold:
> The proposal to add arenas to the standard library is on indefinite hold due to concerns about API pollution.
I think the parent comment was using arenas as an example that GOEXPERIMENTs don't always move forward (like arenas), or can change while still experimental in ways that backward compatibility would normally not allow (like synctest).
The arena GOEXPERIMENT has not yet been dropped as of Go 1.25, but as I understand it, the plan is to remove arenas from the runtime when 'regions' are introduced, which have similar performance benefits but a much lower API impact:
Yes I was very excited to see the new json encoding changes land, can’t wait to try them out! The new omitempty and map key marshalling in particular will help clean up some of my ugly code.
To be fair, the existing json package in Go's standard library is somewhat infamous because it is non-streaming, so it has performance issues with large documents. One of the goals of json/v2 was to remedy this.
Zero values can, for the most part, be caught with a custom UnmarshalJSON implementation (and the new UnmarshalJSONFrom interface ought to remove most of the performance penalty associated with that). The one problem is what to do when the field is missing entirely, because then UnmarshalJSON(From) will never be invoked.
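A minimal sketch of that approach (the Port type and its non-zero rule are made up for illustration):

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// Port is an illustrative type that refuses to decode from a zero value.
type Port int

func (p *Port) UnmarshalJSON(data []byte) error {
	var v int
	if err := json.Unmarshal(data, &v); err != nil {
		return err
	}
	if v == 0 {
		return errors.New("port must be non-zero")
	}
	*p = Port(v)
	return nil
}

func main() {
	var cfg struct {
		Port Port `json:"port"`
	}
	// Fails: the value is present but zero. A missing "port" key, however,
	// never invokes UnmarshalJSON at all, which is the gap described above.
	err := json.Unmarshal([]byte(`{"port": 0}`), &cfg)
	fmt.Println(err)
}
```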
I have been thinking about suggesting a new struct field tag for JSON parsing: `json:',required'`, which would return an error when the struct field is absent from the respective JSON object. The tricky part is mostly in how to phrase that proposal in a way that makes it more likely for the Go devs to accept it. If I just come in with "I hate zero values", that may have some truth to it, but it's not going to be conducive to the discussion going the way I want.
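To illustrate, the proposed tag (not an existing feature of encoding/json) would read something like:

```go
// Hypothetical: there is no ",required" option in encoding/json today.
type Config struct {
	Port int `json:",required"` // decoding would fail if the "Port" key is absent
}
```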
That's the joke: you either make a whole new DSL that is then not even really that typesafe, or you just embrace pushing bad values into your system.
Because there is no standard or convention, everybody brings their own "nullable" wrappers between JSON, SQL, and whatnot, that are all half-baked and incompatible.
Look at rust/serde to see what it could have been.
You're right, go has not "taken it over". I meant more that it has moved into that niche since its conception, while being really bad at dealing with JSON correctly.
1. JSON is the lingua franca of data for the web, so golang does a lot of JSON processing
2. The golang stdlib json packages haven't had much attention in the last decade
I was just thinking recently that my biggest pain point with the upstream JSON packages was the fact that you can't easily add custom marshal/unmarshal code to objects in a package you don't control. I'm actually really excited about this change.
The other reason this is at the top of the list, of course, is that there are no major interesting language features being added. This is a combination of the fact that golang is a pretty mature language and the team's slowness in adding new language features, both of which I appreciate.
I just love how this language marches forward. I have so many colleagues that hate many aspects of it but I sit here combining Go, Goa and SQLc writing mountains of code and having a fairly good compiler behind me. I understand what I’m missing out on by not using stricter languages and so often it’s a totally fine trade off.
Go is the only language where I've come back to a nontrivial source code after 10 years of letting it sit and have had zero problems building and running. That alone, for me, more than makes up for its idiosyncrasies.
As a more sysadmin/ops focused guy it really is the killer feature.
Static binaries and a more Java-esque resource profile than say Python are the cherries on top.
Okay, C++ is believable, but can you really build a Java / .NET project that was not touched for 20+ years with no changes to the code or the build process (while also using the latest version of the SDKs)?
I imagine you can _make_ a project compile with some amount of effort (thinking maybe a week at most), but it wouldn't exactly be "unzip the old archive and execute ./build.bat".
Yes, because Ant has existed since 2000, Maven since 2004, and MSBuild since 2003.
Before central package management was common practice, we used to store libraries (JARs and DLLs) directly in source control in some libs folder.
Afterwards, even with central package management, enterprise software done right does not call out to the Internet on every build; rather, there are internal repositories curated by legal and IT, and only those packages are allowed to be used in projects.
So the tooling has naturally been around for 20+ years; no one is doing YOLO project management when playing with customers' money.
As for the "...latest version of the SDKs..", that is moving the goalposts; there is no mention of it in:
> Go is the only language where I've come back to a nontrivial source code after 10 years of letting it sit and have had zero problems building and running. That alone, for me, more than makes up for its idiosyncrasies.
Ant and Maven have existed for a long time, but for me they didn't prevent Java (and other JVM language) projects from suffering significant bitrot in the build process.
For example, I worked on a project that one day simply stopped building with Maven, with no changes to the JVM version, any of the dependencies, or the Maven version itself. After a while I gave up trying to figure it out, because the same project could still be built with Gradle!
Older Scala projects were a pain in the ass to build because the Typesafe repositories stopped accepting plain HTTP connections, requiring obscure configuration changes to sbt. I've never had to deal with things like that in the world of Go.
> As for the "...latest version of the SDKs..", that is moving the goalposts; there is no mention of it in [...]
I thought it was implied since tooling & library breakages over the years happen and sometimes you can't just get the old SDK to run on the latest Windows / macOS. If the languages and Ant/Maven are backwards compatible to that extent, that's actually pretty good!
I had to deal with moving a .NET Framework 4.7 project to .NET Standard 2.0 and it wasn't effortless (although upgrading to each new .NET release after that has been pretty simple so far). We took a couple of weeks even though we had minimal dependencies since we're careful about that stuff.
This. Maintainability and refactorability are some of Go's major superpowers for me, which enable getting into any code base and updating it. These are supported by features like static typing, fast compile times, etc.
Of note, I've found this to be very important with AI-generated code, where it's easy to grok and refactor what the AI produced.
In all fairness, 10 years ago the deps would have been vendored in, which sidesteps a whole set of problems if security, remote API version compat, and new features are not a major need.
Yes, but with all the v2 packages popping up in the stdlib, we will get a lot of outdated code and a lot of "I need to know v1 and v2, because I will come across both".
Also, most of this can be automated with `go install golang.org/x/tools/gopls/internal/analysis/modernize/cmd/modernize@latest && modernize -fix ./...`
The difference is that going back to Go code you've written a few years ago, isn't nearly as bad as going back to Perl code you've written a few years ago!
And having written a lot of Common Lisp, Go code is extraordinarily straightforward, in the sense that every developer writes in almost the exact same style.
This is not true for Common Lisp (even though it's not as bad as people make it out to be).
I feel the exact same way with C versus C++, even if I was the person to write the C++.
I've gotten used to golang, though it's still not my favourite language to program in by any stretch. One issue I've been having, though, is the documentation.
Documentation for third-party modules in Python is fantastic, almost universally so. In nearly every case of using a third-party library, large or small, there's sufficient documentation to get up and running.
Golang libraries, however, seem to be the opposite. In most cases there's either no documentation whatsoever on how to use things, or, more commonly, there is example code in the readme which is out of date and does not work at all.
The IDE integration with golang is great, and it makes some of this a bit easier, but I also still get a ton of situations where my editor will offer some field or function that looks like what I want (and is what I'm typing to see if it will autocomplete), but once I select it, it complains that there's no such field or function. Still haven't figured that out.
So yeah, I dunno. The language is 'great'; it certainly has some extreme strengths and conveniences, like the fact that 'run this function with these arguments in a separate thread' is a language keyword and not some deep dive into subprocess or threading or concurrent.futures; the fact that synchronization functionality is trivially easy to access; sync.Once feels so extremely obvious for a language where concurrency is king; and so on.
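For example, a minimal sketch of the two things mentioned, the `go` keyword and sync.Once (names are illustrative):

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var once sync.Once
	var wg sync.WaitGroup

	for i := 0; i < 3; i++ {
		wg.Add(1)
		// "Run this function in a separate goroutine" is just the `go` keyword.
		go func(id int) {
			defer wg.Done()
			// sync.Once guarantees the setup runs exactly once across goroutines.
			once.Do(func() { fmt.Println("initialized") })
			fmt.Println("worker", id)
		}(i)
	}
	wg.Wait()
}
```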
Still, the ecosystem is... a bit of a mess, at the best of times. Good modules are great, all other modules are awful.
Generally gophers just use the standard library as much as possible. There isn't the usual set of "must-have" dependencies, and generally speaking when a gopher tries to solve a problem, the first step isn't to search for a 3rd party library that solves it for them.
Obviously this is a broad generalisation and there are plenty of gophers who swear by using one or more libraries, and there are plenty of gophers who do rely on third-party dependencies. But this is still noticeably less prevalent than in many other languages, especially the more popular ones in web dev.
As others have said, it also helps that Go code is easy to read and emphasises simplicity. The code is often more readable than the documentation, for sure. Whether you consider this bad documentation is up to you ;)
I quite frankly will just read the code. Go generally discourages abstractions so any code you jump into is fairly straightforward (compared to a hierarchy of abstract classes, dependency injected implementations, nested pattern matching with destructuring etc etc).
Regarding your IDE issues: I've found the new wave of Copilot/Cursor behavior to be the culprit. Sometimes I just disable it and use the agent if I want it to do something. But it'll completely fail to suggest an autocomplete for a method that absolutely exists.
> Go generally discourages abstractions so any code you jump into is fairly straightforward
This is a really anti-intellectual take. All of software engineering is about building abstractions. Not having abstractions makes the structure less easy to understand because they're made implicit, and forces developers to repeat themselves and use brittle hacks. It's not a way to build robust or maintainable software.
I think the more charitable interpretation is "Go generally discourages metaprogramming." Which I would agree with, and I think positively distinguishes it from most popular languages.
Go mostly only has abstractions that the language designers put into the language. It is (mostly) hostile to users defining their own new abstractions.
A case in point is that arrays and maps (and the 'make' function etc) were always generic, but as a user until fairly recently you couldn't define your own generic data structures and algorithms.
Go discouraging abstractions is sorta just... wrong anyway. Go doesn't discourage building abstractions, it discourages building deep / layered abstractions.
That is a key point in my opinion. A typical stack trace of a Spring (Java) application can easily be 1000 to 2000 lines long. That is not so common in Go, as far as I know (I'm not a Go expert ...).
Not really, it's more like it encourages "wide" abstraction (lots of shallow abstractions) that get pieced together vs heavily nested abstractions that encapsulate other abstractions. It's a very imperative language.
Did you cherry-pick that part of the sentence and ignore "(compared to a hierarchy of abstract classes, dependency injected implementations, nested pattern matching with destructuring etc etc)" on purpose, or?
Of course, you'll probably retreat and say "Go is better for small projects", but every large project started as a small one, and it's really hard to justify rewriting a project in a new language in a business context.
You don't need a hierarchy of abstract classes, dependency injected implementations, nested pattern matching with destructuring, etc. for any project. If one decides to implement these techniques on an ad-hoc basis in Go to solve problems, that's more to do with trying to apply principles and techniques from other languages in Go.
Really, there is nothing in the language that prevents you from creating crazy AbstractFactoryFactories or doing DI. What really prevents this is the community. In enterprise C# / Java, insanity is essentially mandated.
I enjoy the Go ecosystem quite a bit and haven't found many issues with documentation. I love how open source modules are documented on pkg.go.dev, including those from major providers, like AWS, Google, etc. Every library has the same references. When examples are useful, such as with charting modules, I've found that the projects do provide them. On the occasion where the README.md code is out of date, it's been easy for me to check pkg.go.dev and update it myself.
> my editor will offer some field or function that looks like what I want (and is what I'm typing to see if it will autocomplete) but once I select it it complains that there's no such field or function
Generally I find an updated example in one of the test files, or I can figure out how to use a library by reading the test files in the repo. For me it's the opposite problem: Python documentation is too long in some cases, and it's not intuitive to find what I want if it's not trivial, so I've had to use web search or an LLM.
Python package documentation is abysmal. It tends to read like a novel and yet still only covers surface-layer details with simplistic examples. It's next to impossible to just "get an overview" of what's available: just show me the modules, classes, functions, etc. Don't make me spend 30 minutes trying to find an explanation for that one function which just takes `**kwargs`, which ends up only being covered in the footnote of some random page in the documentation on something otherwise completely unrelated.
I wrote a lot of Java in a past life, and the documentation situation is night and day, for sure. I think it's partly a syntax/tooling issue, and partly a cultural thing. Luckily Go's standard library (+ `/x/` modules) lets me avoid third-party dependencies in many cases. The documentation from the Go team is very good in my opinion.
This is so true and unfortunate, because golang has built-in Example functions that closely follow the test function convention. It means that all that really needs to change is how godoc promotes or badges libraries with examples.
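For reference, that feature is the Example function convention: `go test` runs these and verifies the trailing Output comment, and pkg.go.dev renders them as runnable examples. A minimal sketch, assuming it sits in a _test.go file of some package greet:

```go
package greet

import "fmt"

// Example is executed by `go test`; the Output comment below is checked
// against what the function prints, and the example appears on pkg.go.dev.
func Example() {
	fmt.Println("Hello, world")
	// Output: Hello, world
}
```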
I did not like it at first but it has grown on me. I still have my gripes, which are mostly things that come from its overall architecture and will never be resolved, but it is pretty enjoyable to use for the limited domain I use it in at work.
Just watch as most libraries now update their go.mod to say 1.25, despite using no 1.25 features, meaning those who want to continue on 1.24 (which will still have patch releases for six months...) are forced to remain on older versions or jump through lots of hoops.
This is a common issue with Rust projects as well. At least with Rust you have the idea of "MSRV" (minimum supported rust version). I've never heard it discussed within Go's community.
There's no MSGV. Everyone pins the latest.
This also plagues dependencies. People pin to a specific version (i.e., 1.23) instead of the major version (at least 1.0, or at least 1.2, etc.).
The "go x.yy" line in go.mod is supposed to be that MSGV, but `go mod init` will default it to the current version on creation. While you could have tooling like `cargo-msrv` to determine what that value would be optimal, the fact that only the latest two Go versions are supported means it's not particularly useful in most cases.
Now that I think about it more, when I've seen it happen before, it tends to be on projects that use dependabot / renovate. If any of those updates depend (directly or transitively) on a later version of Go, the go.mod would be bumped accordingly for them.
I have a vague feeling it was related to testcontainers or docker, and at the time that job's Go install was always at least 6 months behind. At least with recent Go, it'll switch to a later version that it downloads via the module proxy, which would have helped a lot back then :S
Yay, new version! Not the most exciting release (as Go releases tend not to be, which is good), but hopefully jsonv2 and greentea can get some testing and become standard in 1.26.
To be fair, I read the article but still don't know what greentea is. The article never directly refers to the new GC by this name; it appears in a command-line option value, and that's about it.
> LookupMX and Resolver.LookupMX now return DNS names that look like valid IP address, as well as valid domain names. Previously if a name server returned an IP address as a DNS name, LookupMX would discard it, as required by the RFCs. However, name servers in practice do sometimes return IP addresses.
This one is interesting; which servers return an IP address as a record? Why would they want to do this?
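For reference, a minimal sketch of the API in question; with 1.25, entries whose Host is a literal IP address are no longer dropped from the result:

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	// net.LookupMX returns the MX records for a domain. Previously, records
	// whose Host looked like an IP address were silently discarded.
	records, err := net.LookupMX("example.com")
	if err != nil {
		panic(err)
	}
	for _, mx := range records {
		fmt.Println(mx.Host, mx.Pref)
	}
}
```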
> TLS servers now prefer the highest supported protocol version, even if it isn’t the client’s most preferred protocol version.
> Both TLS clients and servers are now stricter in following the specifications and in rejecting off-spec behavior. Connections with compliant peers should be unaffected.
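If a deployment actually depends on negotiating a particular protocol version, the range can still be pinned explicitly in the config; a sketch using crypto/tls (whether you want this depends on your setup):

```go
package main

import "crypto/tls"

func main() {
	// Explicitly constrain the protocol versions a server will negotiate,
	// e.g. to keep TLS 1.2 as both floor and ceiling during a migration.
	cfg := &tls.Config{
		MinVersion: tls.VersionTLS12,
		MaxVersion: tls.VersionTLS12,
	}
	_ = cfg // pass to tls.Listen, http.Server{TLSConfig: cfg}, etc.
}
```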
Right, I'm not saying it's impossible in Go, but it's so easy in Lisp you're bound to do it even by accident. There are no "AST utilities" in Lisp because the AST is just a normal list / tree of primitive values (of the `symbol` type). You operate on code structures with the same libraries that you operate on a list / tree of numbers, strings, etc. Code is data.
I gotta admit I never formally learned Lua in any rigorous way, I just picked up enough to script with it in existing codebases. I'll often write Python scripts that manipulate Lua programs, for example.
> LookupMX and Resolver.LookupMX now return DNS names that look like valid IP address, as well as valid domain names. Previously if a name server returned an IP address as a DNS name, LookupMX would discard it, as required by the RFCs. However, name servers in practice do sometimes return IP addresses.
Ah, intentionally making code not standards compliant.
Standards are toilet paper in the general case. Only in the rare cases where reality matches them do they matter. Anyone can write anything on a piece of paper. What code is executing on the DNS server at the end of the day is what matters.
Readability debates are usually boring because it’s so subjective, but in this case it’s just your (admitted!) unfamiliarity. Lots and lots of people would disagree with you that Go is unreadable. Go isn’t pretty or cute, but one of its strengths is its relative clarity. All languages require some familiarity to read properly.
> but in this case it’s just your (admitted!) unfamiliarity
That's exactly the problem. Golang has syntax different from other imperative languages not because its syntax brings something new to the table, but just for the sake of being different. In other words, it's an entry barrier that provides nothing in return.
To illustrate my point, imagine someone coming up with a new measurement unit "my_unit" equal to 0.73926745 cm. The first question is "why", because it solves zero problems for which the metric system would be impractical, while adding new cognitive load for people trying to use it. And then there's the counterargument "you're just not familiar with it!", which is a fair point because objectively, you can't say that either the centimeter or "my_unit" is better. It's just that it's an unnecessary cost of switching from an already established standard that works equally well.
> but one of its strengths is its relative clarity
How do you define "relative clarity" in a way that isn't "so subjective" and not immediately due to familiarity?
To me it seems like you're saying it's all subjective except for Go's relative readability, but I'm not sure what's making said relative readability any less subjective.
It is all subjective, but if a large number of subjects have a consistent opinion then that is something to take seriously rather than dismiss with absurd hyperbole (“most garbage”, “mostly unreadable”).
Admittedly I haven't touched Go in around ten years, so I'm sure things have changed, but I remember being really surprised by how readable it was and how I could jump straight into codebases I had never seen before.
My first problem is "where the fuck is this function defined" and then "what is this type actually". Answering these two questions for a random line in random code is surprisingly difficult.
Do you mean that you don't know what type a certain variable has? If so, just add `theVariableName = false` and try to compile it (or just `go vet ./path/to/folder` from the repo root) and the error message will tell you the type. Same as with every other strongly typed language.
> Alternatively, users can implement functions that match MarshalFunc, MarshalToFunc, UnmarshalFunc, or UnmarshalFromFunc to specify the JSON representation for arbitrary types. This provides callers of JSON functionality with control over how any arbitrary type is serialized as JSON.
Awesome stuff.