Ah, blockchains: where people realize that all those "it doesn't matter" bugs, buried under 3 layers of indirection due to shaky primitives, actually do matter. Other bugs like this could be used to execute arbitrary code on the computer that compiles the software, pulling random "important looking" stuff into memory (xz-backdoor style) to be decoded and executed by the backdoor. Of course, there will never be a day where someone has to compile untrusted Solidity code.
> The lesson? Always test critical software under multiple compilers and library versions — especially when enabling a new language standard.
Don't have giga-complicated language jockey stuff backing software that can't afford to even have one bug.
Thorrez 4 hours ago [-]
I'm not sure how blockchain is relevant here.
The only one that needs to compile Solidity is the person who wrote the Solidity, right? Same with non-blockchain software. So a bug in a Solidity compiler will have the same impact as a bug in any other compiler. At least with regard to executing untrusted code.
mathiaspoint 21 hours ago [-]
This is why I quit using eth. They were just way too comfortable with insane complexity.
tkz1312 9 hours ago [-]
compiling untrusted code via the wasm builds of solc is quite reasonable and is done at scale by several providers (e.g. etherscan, remix).
vlovich123 1 day ago [-]
All I took away from this is how much more complicated C++ keeps becoming as a language just to make the syntax slightly more convenient.
saghm 1 day ago [-]
I get as exasperated at C++ as anyone else, but IMO there's another takeaway here, which is that smart contracts are an absolutely terrible idea. Relying on code as the source of truth for a transaction just completely disregards the reality that code is always going to be buggy. For those who might not be aware, this is the same smart contract framework where someone accidentally killed $300 million of transactions because a function in a library for setting the wallet associated with a private key was defined as public instead of private: https://medium.com/cybermiles/i-accidentally-killed-it-and-e...
Yes, you can fix issues like this with a "hard fork" if you have a large enough consensus, but at that point, does the system of having smart contracts actually improve anything over the one where the software is downstream rather than the source of truth, or are you just replacing one form of "human intervention required" with a different but worse one?
RealityVoid 1 day ago [-]
As opposed to being subject to interpretation of the truth, with the courts of varying quality. My point is... everything has error bars.
saghm 28 minutes ago [-]
The reliance on context and human intervention in our current system is a feature, not a bug. That's my entire point; if you get rid of that, you're introducing a lot of issues, and if you don't, you're throwing out a system with literal centuries of stress-testing where the flaws are legitimate but well understood to switch to one with literally no evidence to suggest that it's better.
sealeck 18 hours ago [-]
> with the courts of varying quality
You can usually pick the jurisdiction (e.g. many contracts are under jurisdiction of England & Wales/Switzerland/Singapore despite neither party being based there, or doing business there).
I think in general judges are quite considered, and the answers you get from courts tend to make quite a lot of sense, especially if you are aware of general legal principles and precedent.
josefx 13 hours ago [-]
You are still subject to courts if you use smart contracts, they change literally nothing about that.
eightysixfour 22 hours ago [-]
It must. Things that interact with the real world need slack.
usmannk 1 day ago [-]
This is about a language compiler bug. There are no takeaways about smart contracts here.
saghm 1 day ago [-]
It's about a compiler bug in C++ that had downstream effects in the compiler for Solidity, which is a language for developing smart contracts. Yes, every compiler can have bugs, even ones not relating to smart contracts, but that doesn't seem like a very convincing argument that we should be using compilers for more things rather than boring regular code that isn't considered to be contractual.
tsujamin 1 day ago [-]
So long as you’re writing your smart contracts with a chisel, into a stone tablet, with no compilers or assemblers in sight!
tkz1312 24 hours ago [-]
The bug was a simple segfault and did not result in the production of invalid or incorrect code.
saghm 27 minutes ago [-]
Segfaults are the lucky case when you run into undefined behavior. The unlucky case is that you just get a program doing something different than what you intended without any clear indication that something went wrong.
charcircuit 1 day ago [-]
You can fix smart contracts without a hard fork. In fact it's common practice for them to be upgradable.
saghm 1 day ago [-]
Does it fix them only going forward, or does it actually update what happened in previous transactions retroactively? If the latter, does everyone involved in any transactions using said contract get a say in whether the upgrade occurs? I'm skeptical that this would be a good mechanism in general, given how likely it seems that some contracts are so widely used that upgrading them retroactively would essentially require a hard fork, but I'm open to the idea that I might be missing something here.
charcircuit 1 day ago [-]
It only updates the code that will run, fixing it going forward. To protect themselves against losses due to bugs, people can use insurance smart contracts.
saghm 25 minutes ago [-]
I guess I'm just skeptical that writing more code to try to protect against code being buggy will result in a system that works better than what we have today. Software with tests written for it still has plenty of bugs, after all.
charcircuit 21 minutes ago [-]
Expecting code to be perfect and code complete the first time around is typically a losing bet.
twoodfin 1 day ago [-]
It’s less about convenient syntax and more about simplifying the construction of abstractions.
You could argue that the latter is the core drive to evolve the standard.
mgaunard 1 day ago [-]
Every new version of C++ (and sometimes even new versions of C++ compilers) can break code in subtle ways.
You should always do extensive testing before upgrading.
vlovich123 23 hours ago [-]
I have not observed the same with other languages, whether that’s Java, Rust, or Swift.
And note this is about an older compiler version using an older library with a "newer" language spec (i.e. one that has been around for 5 years). If you use up-to-date toolchains there's less of an issue. This is yet another perennial weakness in the C++ ecosystem that isn't present in other languages, because the standards committee continues to abrogate its duties here.
reactordev 1 day ago [-]
That's the crux of the issue. People fear upgrading because of the bugs introduced by adding more complexity for terser syntax.
How much longer will you suffer?
mgaunard 1 day ago [-]
The same is true for any software. If you care about reliability, you pin, and carefully test before upgrading.
immibis 1 day ago [-]
It's not slightly, it's substantially more complicated to become substantially more convenient. The leap from C to C++ is similar to the leap from assembly code to C. As you add features, the language becomes more complex. That's just how it is.
Most languages deal with this by limiting the features they have - but not C++! (or Scala, which is like Java's C++)
bri3d 1 day ago [-]
This is a perfect summary of C++: substantially more complicated for substantial convenience.
Add Boost to the mix, as in this bug, and C++ becomes quite ludicrous really, but it can also combine efficiency, interoperability, and performance in a way that's quite difficult to achieve in other languages.
IMO, starting a good C++ project is an exercise in defining a specific language based on a subset of C++ more than anything else: done right, it can be a big lever, done wrong, it can be a big foot-gun.
tialaramex 1 day ago [-]
If we make the underlying mistake in Rust (I actually did last week, for unrelated code) by defining an operator implementation as simply calling itself rather than calling some concrete implementation, the compiler warns us: this is infinite recursion, which probably wasn't what you wanted.
G++ has to emit code here to recurse infinitely, so it's not as though there's no opportunity to say "Huh, that probably won't be what they meant"
jcelerier 1 day ago [-]
C++ compilers have had warnings about infinite recursion for years now (since at least 2015 for clang).
tialaramex 1 day ago [-]
Fair. That's actually crazy - I dug into this more and found the chosen Boost package fails tests on the chosen compiler, and so if they'd done even a cursory check to see if what they're doing could make sense there'd be red flags everywhere.
So the overall story is mostly "Our QA is garbage".
saalweachter 10 hours ago [-]
If it's at the "we never compile with -Wall -Werror" level of non-existent QA, I feel like that's an insult to garbage.
tialaramex 8 hours ago [-]
-Wall -Werror means that, instead of being usable as a ratchet, the set of "all" warnings in C and C++ became rusted in place. I sympathise with the instinct to write this in your compiler flags, but alas, thanks to Hyrum's law, it makes the future worse; so today people ask that you -Wextra, and no doubt I'll eventually read of how C is fine so long as you -Wall -Wextra -Wevenmore -Wyetmore -Wandthese.
userbinator 16 hours ago [-]
It becomes substantial inconvenience as soon as you need to try to figure out why something doesn't work the way you thought it should.
ranger_danger 1 day ago [-]
You don't have to use all the fancy new features though. Personally I just stick to C++98 for the most part; if I really want auto or lambdas for some reason I might use 11, but I won't go higher.
whizzter 1 day ago [-]
C++11 with lambdas is so much nicer than 98, and even though 14, 17, 20 and 23 have been minor in comparison, the accumulated changes have been radical in turning what was introduced in 11 into something really useful.
Making anything remotely useful out of templates in 98 meant going deep into hideous SFINAE-land (substitution-failure-is-not-an-error), leading to horribly unreadable templates (and the variadic templates in 11, while useful, only made things explode into horrendous hacks).
"if constexpr" from C++17 onward actually makes template expansions not too horrid: you can write expansions that read mostly like regular if-elseif-else clauses (and I think that was perhaps part of the impetus for Zig's comptime, starting from this point with a clean slate instead of going through regular template evolution hell).
addaon 20 hours ago [-]
C++ pre- and post- C++11 are really different languages. It’s by far the biggest break (so far in my career) in how the language is used. I’ll gladly debate the detailed trade-offs of many of the newer features, but going back to a world before rvalue references is too horrifying to consider.
jeffbee 1 day ago [-]
The horrendous hacks went away with concepts. SFINAE is dead. Spaceship operator, while perhaps not ideal from every perspective, radically simplifies the workload of implementing comparators.
saagarjha 24 hours ago [-]
Assuming the thing you want actually has a concept, sure.
majoe 23 hours ago [-]
You can define your own concepts. What is your point?
I haven't seen an example yet where concepts weren't simpler and easier to read/understand than SFINAE shenanigans.
jeffbee 22 hours ago [-]
Not only can you define concepts, it is highly entertaining and educational. You can write a concept that makes a function only available on Tuesdays, for example.
ranger_danger 22 hours ago [-]
I hardly use templates at all, and I find that the changes added in later versions seriously complicate things I would prefer to keep much simpler (initialization rules, for example), so that's one of several reasons I like to stick to older C++ versions.
bdhcuidbebe 16 hours ago [-]
Ah yeah, why not settle for the 14-year-old version?
The MAGA version.
sillysaurusx 1 day ago [-]
Eh, you’re probably missing out. I remember C++20 being really convenient in comparison, though I forget why.
saalweachter 10 hours ago [-]
Eyeballing C++ 20, struct initialization with .member = blah is the one that jumps out.
But also, since the referenced bug involves Boost... I also haven't used Boost in years, since most everything I was using got pulled into modern C++ versions/STLs.
So if you're using C++98 and Boost, do yourself a favor and just switch to C++17 or 20 or whatever.
wavemode 1 day ago [-]
Scala has the advantage (and disadvantage) of not caring one lick about backwards compatibility. They were perfectly happy to burn it all to the ground and redo the language's syntax in v3. And even point releases of Scala frequently have breaking changes.
seanhunter 7 hours ago [-]
I respect that, but kind of lost patience with it when a new version of the compiler decided that my types were no longer decidable in code that hadn’t been touched for years. I tried for about 5 mins to fix it and then just thought “It’s not me, it’s you” and put scala down never to pick it up again.
vlovich123 23 hours ago [-]
I have observed significantly more useful features land in Rust or Swift (or Scala) without horribly complicating the language the way C++ does. More importantly, they migrate existing codebases and provide migration tools instead of breaking old syntax. Rust's approach is the most ambitious here, letting you mix and match editions of the language within a single binary.
DokDidhuAd 22 hours ago [-]
Yeah, C++ is just batshit insane.
> In C++, when you write an expression like a == b, the compiler chooses among available operator== implementations by comparing their match quality. A member function like a.operator==(b) usually has higher priority than a non-member function like operator==(a, b) — unless the types differ too much or are ambiguous.
This is the largest foot-bazooka ever. The point of operator overloading is that the operator in a given expression looks natural, feels natural, and behaves naturally (i.e., functions intuitively). If you have multiple possible operators that could apply to the same syntax, then you're in a worse position than you originally were, where an operator either did one particular thing or didn't apply at all. What operator overloading does in practice is introduce ambiguity, which is self-defeating. You are better off with C-style, named, non-overloaded functions.
You will never find this crap in actual math. "match quality" my ass.
maccard 20 hours ago [-]
> You will never find this crap in actual math
Sure you do. What’s 6/2(1+2)
kazinator 4 hours ago [-]
It gets more confusing when those two compilers are the same one. Compiler bug causes compiler bug elsewhere in same compiler, which causes bug in compiled image of some library function used by the compiler, which causes another bug in the compiler when the compiler is re-compiled again this time with the compilation relying on the wrong compiled version of the function rather than the correct interpreted one. This second compiler bug affecting something is what you notice first, and is your investigative starting point.
For most people in computing, a side project is the only way to experience this kind of thing.
metadat 24 hours ago [-]
What is the appeal / high-utility use case for the spaceship "<=>" operator? It seems quite unintuitive to me... too many doodads is a turn-off, like a car with excessive modifications. Does continually adding more, then more and more, eventually become a stressful nightmare of complexity?
For a concrete example of what this looks like, check out the Homer Simpson-designed "everything" car:
https://media.wired.com/photos/593252a1edfced5820d0fa07/mast...
p.s. Fascinating bug! One of the most interesting cases I've encountered.
tialaramex 23 hours ago [-]
The spaceship operator is an attempt to achieve the same thing as Rust's PartialOrd -- to have a single authoritative place to explain the ordering of a type.
Historically C++ only had the separate comparison operators, plus operator overloading, so if you wanted ordering you were expected to overload all of the operators and do so in a consistent way. It turns out that's easy to get wrong, and if you do, your C++ program is often nonsense: it has no defined meaning whatsoever. For example, if a < b && b < a, then too bad: many built-in library functions might do absolutely anything in, say, C++17.
With the spaceship operator you're still screwed if your type provides an incoherent ordering (unlike Rust where that just means the type is less useful) but it's much more likely you'll write something which actually works.
maxlybbert 22 hours ago [-]
When the STL became part of the standard library ( http://www.stlport.org/resources/StepanovUSA.html ), there was a question of how to handle algorithms that sort containers, perform a binary search, or in some other way need to know whether “a” is less-than, equal-to, or greater-than “b”. The algorithms have to work on primitive types and user-defined types as efficiently as possible. They eventually decided to require only a function for “less-than.” And if “a < b” returns false, and “b < a” also returns false, then “a” and “b” are considered equal.
There are times that doesn’t work, so documentation usually has a footnote that (1) certain algorithms require a partial ordering and not necessarily a total ordering, and (2) to use those algorithms, you must implement less-than, but any other comparison operator is ignored by the algorithm; instead, less-than is used to figure out greater-than and equal-to as needed. This was considered better than requiring programmers to implement a collection of comparison operators, and trusting those programmers to make those operators act consistently with each other (e.g., never say that “a” and “b” are both less-than and greater-than each other).
The spaceship operator seems to address this specific case ( https://open-std.org/jtc1/sc22/wg21/docs/papers/2017/p0515r0... ). According to Herb Sutter (note that his name is on the proposal), “We added the C++20 spaceship operator to the language, but we also applied it throughout the C++ standard library and that made the library specification nearly 20 pages shorter — a net reduction” ( https://herbsutter.com/2020/12/ ).
spyrja 21 hours ago [-]
Way back when I actually perused through quite a bit of Stepanov's STL code and stumbled on that very thing, the comparison operator semantics. I remember thinking it very clever of him to approach it that way and it hadn't even crossed my mind at the time that it might lead to some kind of undefined behavior. Thanks for pointing out such an interesting tidbit!
maxlybbert 19 hours ago [-]
Maybe "doesn't work" is the wrong phrase. Usually, it does what people expect (except that the library ignores any "operator==" or "operator!=", which surprises people who went through the trouble to define them). And depending on what "operator<" does, it's possible for distinct values (a.k.a., not-equal values) to compare "equal." So to avoid confusion, people start using phrases like "equivalent" instead of "equal." But usually there's no real confusion: if you sort a list of strings by length, nobody's surprised when "hello" and "green" compare "equal," even though they have different contents. Everybody realizes that they have equal lengths, not equal contents.
Warning, not error, and if you must license the 6-line trivial example, just use CC0-1.0
layer8 22 hours ago [-]
It’s just example code in a blog post though…
sali0 18 hours ago [-]
Not having this statement results in a compile-time error in Solidity.
yjftsjthsd-h 17 hours ago [-]
Even if that's true: Okay, so why "UNLICENSED" instead of "CC0-1.0"? Especially for a trivial example? Or I guess MIT or something if you really care about something at the same level as hello world?
Other than that, simply not specifying any license would be equivalent.
https://stackoverflow.com/questions/68332228/spdx-license-id...
dejj 1 day ago [-]
Would the license help to prevent an AI training on the example code?
qualeed 1 day ago [-]
If they eventually start paying attention to licenses, maybe?
LtWorf 1 day ago [-]
No, AI companies are apparently exempted from copyright laws.
edit: bring on the downvotes… doesn't change the fact that fb illegally downloaded a lot of material to train, and so did every other AI company.
bdhcuidbebe 16 hours ago [-]
You are right.
typpilol 1 day ago [-]
I saw that too and had to double take lol
vhantz 7 hours ago [-]
A better title would be "Upgrading underlying language standard without even minimal testing brings down compiler".
jokoon 15 hours ago [-]
I like C++, but it is so complex to compile that this situation was bound to happen.
I wish new C++ standards would make old C++ not work: old code could still be compiled with old compilers, and it would encourage developers to fix their code or rewrite it.
C++ needs its python2to3 moment.
Cpp2/cppfront feels a bit like that.
I don't mean to say that C++11 to 20 are bad, but they are probably expensive and tedious to implement.
graemep 6 hours ago [-]
> C++ needs it's python2to3 moment.
No one needs a python 2 to 3 moment. It was not a moment, it was a painful decade or so.
There might be a better way to handle changes in C++, but being like Python is not a good way to go.
cesaref 12 hours ago [-]
This situation happens with plenty of compilers for plenty of languages. It's just software after all, we don't expect any software to be perfect, so why expect compilers to be perfect?
As for deprecating old C++, yes, I can see the appeal, but no, the community hasn't gone that route. C++ has always been a 'mix and match' approach, with different parts of the community using more or less of the language features, with the intention being that you can choose what to adopt when, and move older code bases forward (or not!) depending on individual concerns.
It's one of C++'s strengths, although it doesn't always feel like it.
questionaaire 1 day ago [-]
My only question is how this operator== override eluded the g++ test suite:
https://osec.io/blog/2025-08-11-compiler-bug-causes-compiler...
The better question, in my opinion, is how many other known defects are just sitting there in the GNU buganizer with good reports for more than a decade.
bdhcuidbebe 16 hours ago [-]
Fewer than before this was found. Let's celebrate the win, huh?
lostmsu 8 hours ago [-]
> This post unpacks how this happened — and why none of the individual components are technically "broken": 1. A 12-year-old overload resolution bug in G++
I don't know what the author meant here. The whole issue exists because G++'s overload resolution mechanism was broken.
Seems like the free comparison function in boost rational should have been constrained to non-rationals
i.e. !is_same_v<rational, U>
ethan_smith 13 hours ago [-]
The constraint should be `template <typename T> requires (!std::is_same_v<T, rational<>>)` since the spaceship operator's rewriting rules for mixed-type comparisons create the ambiguity that triggered the compiler bug.
moonlet 1 day ago [-]
I brain-typo’d the title into a 12-year-old girl’s bug taking down Solidity and this, frankly, does not live up to that hype
sylware 12 hours ago [-]
Come on, with a bit of irony, they have their first major bug: they are pulling in a C++ compiler (either gcc or clang).