I'm probably going to make a few enemies with this opinion, but I think modern C++ is just an utterly broken mess of a language. They should have just stopped extending it after C++11.
When I look at C++14 and later I can't help but throw my hands up, laugh, and wonder: who, except for a small circle of language academics, actually believes that all this new template syntax crap helps developers?
Personally I judge code quality by a) Functionality (does it work, is it safe?), b) Readability c) Conciseness d) Performance and e) Extendibility, in this order, and I don't see how these new features in reality help move any of these meaningfully in the right direction.
I know the intentions are good, and the argument is that "it's intended for library developers", but what percentage of developers is that vs. regular app/backend devs? In reality, what's going to happen is that inside every organization a group of developers with good intentions, a lack of experience and too much time will learn it all and then feel the urge to "put their new knowledge to work improving the codebase", which generally just puts everyone else in pain and accomplishes exactly nothing.
Meanwhile it's 2021 and C++ coders are still
- Waiting for Cross-Platform standardized SIMD vector datatypes
- Using nonstandard extensions, libraries or home-baked solutions to run computations in parallel on many cores or on different processors than the CPU
- Debugging cross-platform code using couts, cerrs and printfs
- Forced to use boost for even quite elementary operations on std::strings.
Yes, some of these things are hard to fix and require collaboration among real people and real companies. And yes, it's a lot easier to bury your head in the soft academic sand and come up with some new interesting toy feature. It's like the committee has given up.
Started coding C++ when I was 14 -- 22 years ago.
> - Waiting for Cross-Platform standardized SIMD vector datatypes
which language has standardized SIMD vector datatypes ? most languages don't even have any ability to express SIMD while in C++ I can just use Vc (https://github.com/VcDevel/Vc), nsimd (https://github.com/agenium-scale/nsimd) or one of the other ton of alternatives, and have stuff that JustWorksTM on more architectures than most languages even support
- Using nonstandard extensions, libraries or home-baked solutions to run computations in parallel on many cores or on different processors than the CPU
what are the other native languages with a standardized memory model for atomics ? and, what's the problem with using libraries ? it's not like you're going to use C# or Java's built-in threadpools if you are doing any serious work, no ? Do they even have something as easy to use as https://github.com/taskflow/taskflow ?
- Debugging cross-platform code using couts, cerrs and printfs
because people never use console.log in JS or System.println in C# maybe ?
- Forced to use boost for even quite elementary operations on std::strings.
can you point to non-trivial java projects that do not use Apache Commons ? Also, the boost string algorithms are header-only, so you will end up with exactly the same binaries as if they were in some std::string_algorithms namespace.
Most of what you said is a fair retort, but boost isn't quite as rosy as you make it seem. It's great but it has serious pitfalls which is why many C++ developers really hate it:
A) Boost supports an enormous number of compilers & platforms. Implementing this support takes an enormous amount of expensive preprocessor machinery that slows down the build & makes it hard to debug.
B) Boost is inordinately template-heavy (often even worse than the STL). This is paid for at compile time, and sometimes at runtime and/or in binary size if the library maintainers don't do a good job structuring their templates so that the inlined template API calls a non-templated implementation. The first C++ talk I remember raising this problem was about 5-7 years ago & I doubt boost has been cleaned up in its wake across the board.
C) Library quality is highly variable. It's all under the boost umbrella, but boost networking is different from boost filesystem, different from boost string algorithms, different from boost preprocessor, boost spirit, etc. Each library has its own unique cost impact on build, run, & code size that's hard to evaluate a priori.
Boost is like the STL on steroids but that has its own pitfalls that shouldn't be papered over. Maybe things will get better with modules. That's certainly the hope anyway.
> which language has standardized SIMD vector datatypes ?
Java is getting it soonish. https://openjdk.java.net/jeps/338
Rust has it (but it's fairly platform specific) https://doc.rust-lang.org/edition-guide/rust-2018/simd-for-f...
Dart has it https://www.dartcn.com/articles/server/simd
Javascript has it https://01.org/node/1495
It's actually a bit impressive how many languages have it at this point.
> what are the other native languages with a standardized memory model for atomics
Rust, C, Go?
> It's not like you're going to use C# or Java's built-in threadpools if you are doing any serious work, no ?
Define "serious". By most metrics JVM apps run within 1-2x of C++'s speed; that's really not terribly slow for a managed language. On top of that, there are a lot of places Java can outperform C++ (e.g. high heap allocation rates). Java's threadpools and concurrency model are, IMO, superior to C++'s.
> Do they even have something as easy to use as taskflow
Several internal and external libs do. Java's completable futures, kotlin's/C#'s (and several other languages) async/await. I really don't see anything special about taskflow.
> can you point to non-trivial java projects that do not use Apache Commons
Yes? It's a fairly dated lib at this point as the JDK has pulled in a lot of the functionality there and from guava. We've got a lot of internal apps that don't have Apache commons as a dependency. I think you are behind the times in where Java as an ecosystem is now.
... I just checked your link and wouldn't say that any of these languages have SIMD more than C++ has it currently:
- Java: incubation stage (how is that different from https://github.com/VcDevel/std-simd). Also Java is only getting it soonish for... amd64 and aarch64 ??
- Rust: those seem to be just the normal intrinsics which are available in every C++ compiler ?
- Dart: seems to not go beyond SSE2 atm ? But it looks like the most "officially supported" of the bunch
- Javascript: seems to be some intel-specific stuff which isn't available here on any of my JS environments ?
* Standardized memory model
- Literally false for Rust: https://doc.rust-lang.org/reference/memory-model.html
- The C11 one directly comes from C++: https://stackoverflow.com/a/8877562/1495627
- The Go one does not seem to support acquire-release semantics, which makes it quite removed from e.g. ARM and NVidia hardware from what I can read here ? https://golang.org/pkg/sync/atomic/
That's quite well thought out; without compile-time checks for which operations exist, you end up with code that either targets a very small subset of widely supported operations or isn't really cross-platform. I've seen too much of the following in what is theoretically portable code, because the software fallback will typically be an order of magnitude worse than using a different set of datatypes and operators:
#if defined(__ARM_NEON)
"portable" SIMD goes here
#elif defined(__ALTIVEC__)
different "portable" SIMD goes here
...

> which language has standardized SIMD vector datatypes
C# https://docs.microsoft.com/en-us/dotnet/standard/simd and https://devblogs.microsoft.com/dotnet/hardware-intrinsics-in...
I hope they keep going down this path and make it into a real mess of a language, so that people can finally stop pretending C++ is the solution to any problem, when it is in fact the cause of a lot of your problems.
I began coding C++ over 20 years ago as well, and it required reading thick books even then. I remember my classmates at uni really hated software development, all because of C++. It was way too hard as a beginner's language, even 20 years ago.
I look at all these new features, and I am like: How on earth are you going to teach all this crap to students?
They have painted themselves into a corner. It becomes a language only for those who have already programmed in it for 10-20 years.
This idea that it is only for library developers is a bunch of crap. A lot of learning a language is really about reading the code of the standard library. That was one of the beauties of writing Go code. You regularly look at standard library code and are even encouraged to do so. It teaches you a lot about good style.
Same deal when I program in Julia. Looking at library code is totally normal and common.
Except in C++. I avoided looking at library code like the plague. And I suppose, now it will only get worse.
The worst part of this is that this isn't just a problem for C++ developers but for everybody else too. So many key pieces of software rely on C++ code. It becomes ever harder to migrate or interface with that code as C++ complexity grows.
That was the beauty of a language like Objective-C. Unlike C++ it is a fairly simple language which you can interface easily with. The result was that porting to Swift was really easy. When porting iOS apps to Swift I could pick individual functions and rewrite them to Swift.
There is no hope doing anything like that with C++.
> I look at all these new features, and I am like: How on earth are you going to teach all this crap to students?
You don't. You teach "A tour of C++ 2nd edition"[0] which presents a clean and smaller subset of the language people can wrap their mind around, with everything someone new to modern C++ needs to know to be effective. And you supplement this with "C++ Core Guidelines"[1] which can be enforced by code analysis and provide some examples of common mistakes or questions people might have.
You do not need to know all the details of the language or know every single feature. And you wouldn't teach everything to a student.
But it's true that there is some overhead due to the complexity of the language.
[0]: https://www.stroustrup.com/tour2.html
[1]: https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines
> I'm probably going to make a few enemies with this opinion, but I think modern C++ is just an utterly broken mess of a language. They should have just stopped extending it after C++11.
This is the popular refrain of the day, so I don't know why you frame this as if you're saying something controversial.
The popular refrain has more to do with the lack of memory security features in the language, although I'm sure they will bolt a borrow checker or something on to the language.
There are currently enclaves of developers who know varying versions of C++. There's a good chance that a 20-year C++ veteran would have to consult the documentation for syntax. That's concerning. Defining what something isn't is nearly always more important than defining what it is, and C++ is seemingly trying to be everything.
This is a popular (and increasing) trend in HN comments.
"It is a poor craftsman who blames his tools."
This is a common saying because it is a common occurrence.
People who use the language effectively know all about the complaints. Those people live with their complaints knowing no other language even comes close to meeting their needs. No language on the horizon is even trying to meet their needs.
C++ usage is still growing by leaps and bounds. Attendance at ISO Standard meetings is soaring; until Covid19 killed f2f meetings, each had more than any meeting before; similarly, at conventions. Even the number of C++ conventions held grows every year, with new national ones arising all the time.
Rust is having a go at part of the problem space, and making some headway. But more people pick up C++ for the first time in any given week than the total who have ever tried Rust. It is still way too early to tell whether that will ever not be true.
So the HN trend is very much an echo-chamber phenomenon, with no analog in the wider world.
True, but there's plenty of Stockholm Syndrome as well. C++ is a mess, and there are people who will defend that mess to the end of times. Those people managed to get pretty good and have a deep understanding of all of its quirks, but lack the ability to take a step back and admit that yes, nobody without masochistic tendencies would get into C++20, unless they're already familiar with it.
> except for a small circle of language academics
I'm sorry but can we stop hating on "academics"? No one in research matches your description. The intersection of academia and C++ contains only practitioners (like in the industry), who just want their code to work; and maybe some verification people who'd rather wish C++ was smaller because it is a hell of a beast to do static analysis on. Both these categories are real people having real use cases. The programming language crowd is generally more interested in stuff like dependent types or effect systems, not templates.
> soft academic sand
shrug.
If you replace 'academic' with the secondary definition, "not of practical relevance; of only theoretical interest", it is probably true though. Having known some of the C++ standard contributors, they strongly defend themselves against the "not of practical relevance" part with "look what I wrote". Sure it's clever, but adding language features just to say "look what I wrote, it's clever" is no excuse for building a language that's become a train wreck.
(I have been coding in C++ on and off professionally since 1985 and I do like some of the C++11 and c++14 features. The pointer improvements are great but the template stuff is a complete joke on us).
> Sure it's clever, but adding language features just to say "look what I wrote, it's clever" is no excuse for building a language that's become a train wreck.
Actually, the rationale behind the language features you're criticizing is that people in the real world were already using some techniques in C++ in a needlessly complex and convoluted way, and these new additions not only simplify these implementations but also allow the compilers to output helpful, user-friendlier messages.
Take concepts, for example. You may not like template metaprogramming, but like it or not it is used extensively in the real world, at the very least in the form of the STL and Eigen. Template metaprogramming is a central feature of C++ consumed by practically every single C++ developer, even though most of them rarely produce such code themselves. Does it make any sense at all to criticize work that improves a key feature benefiting each and every C++ programmer, including those who never have to write code with it?
And no one of sane mind would argue in favour of keeping #include and #ifndef/#define guards to the detriment of a proper module system.
Just because you aren't familiar or well-versed with some C++ features, or aware of how extensively they are used, doesn't mean they are not used, or that the stuff you don't know automatically qualifies as a train wreck.
The #1 feature I currently want is the ability to do an implicit lambda capture of a structured binding, at least by reference. I appreciate there are interesting corner cases of like, bindings to bitfields: I simply don't need those corner cases solved... if it just supported a handful of the most obvious cases I would be so so so happy, and then they can spend the next decade arguing about how to solve it 99% (which I say as we know it won't be 100%... this is C++, where everything is some ridiculous epicycle over a previous failed feature :/).
(edit:) OMG, I found this feature in the list!! (It was in the set of structured bindings changes instead of with the changes to lambda expressions, which I had immediately glanced through.) I need to figure out now what version of clang I need to use it (later edit: I don't think clang has it yet; but maybe soon?)... this is seriously going to change my life.
https://oleksandrkvl.github.io/2021/04/02/cpp-20-overview.ht...
This thankfully made it into C++20.
However a full destructuring bind, à la Lisp, hasn't. You can't do `for (auto& [a, [b, c]] : some_container_of_structs)` which is handy for taking apart all sorts of things.
Relatedly there's no "ignore" though it exists in function declaration syntax: you can write `void foo (char the_char, int, long a_long);`. But you can't ignore the parts of a destructure you don't need: `auto& [a, , c]`. This capability is sometimes useful in the function declaration case but is quite handy, say, when a function returns multiple values but you only need one (consider error code and explanation).
And variadic destructuring...well I could go on.
I haven't attended a C++ committee meeting in 25 years (and didn't do a lot when I did) so I have no reason to complain.
Destructuring that lets you ignore parts of the object is usually found in the form of pattern matching.
Lisp destructuring comes directly from macros: CL's destructuring lambda lists and macro lambda lists are closely related cousins.
Macros usually care about all their arguments. Reason being, they are designed to cater to those arguments; an unnecessary element in the syntax of a macro will just be left out from its design, rather than incorporated as a piece of structure that gets ignored. (The exceptions to it are reasonably rare that it's acceptable to just capture a variable here and there and ignore it.)
Yeah: 100% to these complaints; I do run into the full destructuring issue occasionally, but it isn't blocking my ability to do composition of features in the same way this lambda capture issue is ;P.
One day we will get it. I believe the intention is to support full destructuring but it is hard to get a feature added to the standard. Sometimes functionality is cut just to increase the probability that it will be voted in.
For example lambdas were added in C++11, but generic lambdas were cut out and only added in C++14.
What grumby refers to seems trivial to implement.
"If you don't see anything between the commas, then fill it in with a compiler-generated symbol."
>> this is seriously going to change my life.
Now I'm curious. Can you give a small code example of the kind of thing this solves and how it will change your life? ;-)
I constantly use both lambdas and structured bindings; without this feature, I am having to constantly redeclare every single not-a-variable I use in every lambda level and then maintain these lists every time I add (or remove, due to warnings I get) a usage. Here is one of my lambdas:
nest_.Hatch([&, &commit = commit, &issued = issued, &nonce = nonce, &v = v, &r = r, &s = s, &amount = amount, &ratio = ratio, &start = start, &range = range, &funder = funder, &recipient = recipient, &reveal = reveal, &winner = winner]() noexcept { return [=]() noexcept -> task<void> { try {
And like, at least there I am able to redeclare them in a "natural" way... I also tend to hide lambdas inside of macros to let me build new scope constructs, and if a structured binding happens to float across one of those boundaries I am just screwed and have to declare adapter references in the enclosing scope (which is the same number of name repetitions, but I can't reuse the original name and it uses more boilerplate).
Ah I see, yes that's horrible.
It's kind of weird that structured bindings were not capturable with [=](){} before, actually. I'm still stuck on C++11 for most of my work so I cannot use structured bindings at all, but I would not have expected to have to write that kind of monstrosity in C++17.
Out of curiosity, what kind of domain is this?
I believe GCC has supported this for a while now, even before it was added to the list of features for C++20.
Yeah... I did know gcc allowed it, but I didn't know it was because the spec now allows it rather than gcc just doing it anyway. Sadly, I am heavily, heavily using coroutines, even coroutine lambdas with captured structured bindings ;P (don't try to auto-template them though: that crashes the compiler), which clang has much better support for.
I hope one day we can get a widely adopted C and C++ package manager. The friction involved in acquiring and using dependencies with odd build systems, etc. is one of the things I dislike about the language. I’m aware on Linux things are a bit easier, but if it were as easy as “npm install skia”, etc. everywhere, I think many people would use the language more. Rust has package management, but not the ecosystem yet. On the other hand, C/C++ has the ecosystem, but no standard way to easily draw from it.
A widely adopted source-level package manager requires a widely adopted build system. CMake is certainly a contender, but the ecosystem is too fragmented even then & you have to do a lot to try to link disparate build systems together. Also, C++ is a transitive-dependency-hell nightmare & any attempt to solve that (like Rust has) would break every ABI out there. Given how bumpy such breakages have been in the past, I don't think any compiler maintainer is eager for it (even MSVC has decided to largely ossify their STL runtime ABI).
Conan is certainly a laudable attempt at something like this. Without access to their metrics though, it's hard to tell if they're continuing to gain meaningful traction or if their growth curve has plateaued. It's certainly not in use in any project at medium to bigger size companies I've worked at. By comparison, Cocoapods was pretty successful in the iOS ecosystem precisely because Xcode was the de facto build/project system.
I'm a longtime CMake user, but I think even within the CMake world, the solution is quite a bit more complicated than just "everything needs to be CMake", with a lot of hassles that arise when multiple generations of the tooling is involved, when you're trying to pass down transitive dependencies, when package X has a bunch of custom find modules with magic to try to locate system versions of dependencies but silently fall back to vendored ones.
The higher up the stack you get, the worse and worse these problems get, with high-level packages like Tensorflow being completely intractable:
https://github.com/tensorflow/tensorflow/tree/master/tensorf...
Yup. 100% agree. I totally overlooked the shitshow you'll have managing the different versions of CMake a build might require. Somehow Bazel manages to escape that mess. I think that might be a better foundation, but getting everyone to port to that... it's a tall ask & there's many vocal people who are against improving the build system they work with (hell, I've met many engineers who grumble and strongly prefer Makefiles).
> Without access to their metrics though, it's hard to tell if they're continuing to gain meaningful traction or if their growth curve has plateaued.
Some public data that could be used as proxy for traction:
- Some companies using Conan in production can be seen in the committee for Conan 2.0, called "the tribe": https://conan.io/tribe.html. That includes companies like NASA, Bose, TomTom, Apple, Bosch, Continental, Ansys...
- The public repo for ConanCenter packages got approximately 3,500 pull requests in the last year: https://github.com/conan-io/conan-center-index/pulls. This doesn't count contributions to the tool itself.
- https://isocpp.org/files/papers/CppDevSurvey-2020-04-summary... shows a 15% adoption rate
- With over 1,600 subscribers, the #conan channel in the CppLang slack is consistently ranked among the most active channels every month: https://cpplang.slack.com/stats#channels
"I think many people would use the language more"
C++ is considered the industry leading language in many fields. I'm not sure how many more you would want (given that those fields that don't use C++ ARE probably better served with some other language).
I agree the build is painful, but large orgs have for this reason specifically implemented build systems using NuGet, conan/cmake or whatnot.
In personal projects I just download the prebuilt binaries of component libraries and drag and drop them to visual studio, minimizing hassle.
If you discard finesse and scalability as requirements you can actually jury rig a C++ project in a jiffy. You just need to let go of the idea that it must be "industry standard setup".
C++ used to be the industry leading language in many more fields, but it lost ground to other languages. Not a bad thing--"know thyself" and all that. But Rust seems like a credible threat to C++'s remaining niches (bury your head in the sand if you want), and C++ will need to evolve if it is to not lose further market-/mindshare. And it is evolving, as this article points out, but a huge glaring pain point in C++ development remains the build and package management tooling. The aforementioned build systems that large organizations operate aren't nearly as nice as, say, Cargo and I think a lot of greenfield projects who have to choose between cobbling together their own build tool to work with C++ and using Rust + Cargo off the shelf will choose the latter (other factors notwithstanding).
I will get worried when NVidia releases CUDA-Rust, and changes their GPGPUs from C++ memory model to Rust, Microsoft decides to rewrite WinUI in Rust, Apple moves Metal from C++ into Rust, or Unreal/Unity get rewritten in Rust.
You write as if Rust vs. C++ was some sort of competition.
I don't understand this - they are not competing brands or sports teams but tools.
Why would it matter and to whom if C++ use would decline?
If use of C++ declines then I don't understand how that would make the language a lesser tool.
Choose the best tool for the job and all that.
>I hope one day we can get a widely adopted C and C++ package manager. [...] , but if it were as easy as “npm install skia”, etc. everywhere,
It's not just the package manager (the command line tool) ... it's the canonical website source that the tool pulls from.
C++ probably won't have a package manager with the same breadth as newer language ecosystems like npm/Node.js and crates.io/Rust, because for 20+ years C++ was developed by fragmented, independent communities before a canonical repo website funded by a corporation or non-profit was created. There is no C++ institution or entity with industry-wide influence analogous to Joyent (Node.js & npm) or Mozilla (crates.io & cargo).
I wrote 2 previous linked comments about this different timeline: https://news.ycombinator.com/item?id=24846012
Tldr, 2 opposite timelines happened:
- C++: 20+ years of isolated and fragmented development groups created legacy codebases --> then decades later try to create a package manager (vcpkg? Conan? cppget?) that attracts those disparate groups --> thus "herding cats" is an uphill challenge
- npm and crates.io exist at the beginning of language adoption allowing the ecosystem to grow around those package tools and view them as canonical
Go has a perfectly good package manager that works with sources hosted on GitHub and other sites -- there isn't any centralized place for people to publish sources, unlike the other package managers you mentioned.
Go's package manager also came years after the language became widely used, and it is now very widely adopted according to the most recent survey[0].
I think C++ could have a good, unified package management story. It would just require the major stakeholders to all care enough to make it happen, which seems to be the missing piece here.
Go has a small dedicated team that develops and designs the language. They take some input from the broader community but are still the one who decides how things evolve. They decided at some point that go modules was the way to go and everybody followed, because they are the authority who decides how Go evolves.
C++ does not have an equivalent; it's completely decentralized, which results in a messier situation. As a result you have an open market where different people build different tools and approaches for their own problems, then try to get others to use them (similar to what Go had before go modules, when we had a lot of package managers to choose from).
Instead of a top-down decision it's a negotiation between the various actors. But the last thing we need is for the C++ standards committee to standardize a package manager. That would take forever, would result in a messy tool that compromises with all the actors in some way, would be very hard and slow to evolve over time, and would likely result in a lot of pain.
Do the Go sources on GitHub use waf, make, CMake, bazel, or something entirely bespoke? Or is a common build system assumed?
>Go's package manager also came years after the language became widely used, and it is now very widely adopted according to the most recent survey[0].
Are you talking about "pkg.go.dev" and the "go get" command? Isn't there some path dependence in the history of events that's not comparable to C++? Consider:
- Go language: created by Google Inc
- "go get" syntax for package download designed and created by Google Inc
- "pkg.go.dev" funded by Google Inc and highlighted on "golang.org" website that's also run by Google Inc.
There is no business entity or institution in the C++ world that's analogous to Google's influence for Go + golang.org + "go get" + pkg.go.dev.
>It would just require the major stakeholders to all _care_ enough to make it happen,
But it's easier to care if there was an influential C++ behemoth that captured everyone's mindshare to move the entire ecosystem forward. C++ has no such "industry leader" that dictates (or heavily influences) technical direction from the top down.
ABI makes this hard for C++.
Build from source
I actually extremely dislike language-specific package managers. I'm on Linux; the packages should be in my package manager. I don't want to maintain multiple package managers. npm is actually the worst here.
> I actually extremely dislike language specific package managers. I'm on Linux, the packages should be in my package manager. I don't want to maintain multiple package managers. npm is actually the worst here.
As a user of software that doesn't care how it's built, sure. But system package managers are not a solution for general development with C++, or any other language.
If I want to use C or C++ to create software, how do I use libraries that aren't available in a system package manager? What if I need a version of a library that's not available in my system package manager? There are answers here but they aren't good answers (build from source, using whichever of N build tools the project happens to use, or hope there are prebuilt libs hosted somewhere)
Relying on system package managers to contain dependent libraries makes cross-platform development a complete PITA (more than it already is). Now you need the specific versions of all your libraries in the package managers on all platforms, which is a complete non-solution for real development.
The problem is more or less solved - see Nix.
It'll take some decades for the ideas to percolate, but language-specific package managers are definitely not the future.
Exactly.
Also, I'm terrified of this idea of a library manager downloading code from the internet and running it on this machine, without all the tests and QA of individual dependencies like we have in Linux packages.
Also, I've seen so many times people adding dependencies to projects because they did not know the standard library already had what they needed. I get it, it is easier to "pip install foo" than to look for "foo" in the docs. I don't think any sane person can learn everything that is available in the standard library, but searching the docs is always insightful.
The problem is that system-specific package managers are an obstacle to making portable programs.
Even within Linux and BSD there are many flavours of package managers with slightly different naming schemes for their packages.
This fragmentation makes it impossible to have dependencies that just work. You need to either make users install things manually or every author has to probe multiple package names/locations using multiple tools.
Language-specific managers support all of the OSes and just work, especially for users of macOS and Windows (telling people their OS sucks may state a true fact, but doesn't solve portability problems).
The Linux model of package management doesn't work for newer languages. In particular it is heavily reliant on dynamic linking, which tends not to work when you have (a) an unstable ABI (b) generics (c) a culture of static linking.
It works fine, you just ship the static libraries. With static linking your binaries won't have dependencies anyway.
That's not to say the static linking craze is a good thing. We'd be far better off finding a way to dynamically link templates, so you get the security benefits of automatically updated dependencies that dynamic linking gives you.
Some years ago I would have thought all this was really cool. But who are they kidding? What sort of person will be able to keep this whole language in their head?
C++ books were thick bricks already 20 years ago, and students struggled hard to learn it. Now the language is like 3x as complex. Students are going to need a separate bag just for their C++ material.
Sure, you can write in a subset of C++ that is easy to grasp. But when has that ever worked? Who has worked at a company and seen people actually stick to a minimal C++ subset?
No, people get tempted and they start using all the new stuff. Short term it is a real gain. But once you hire a junior developer who has to read this code, they suddenly have 3x as many concepts to learn and understand.
I predict a serious recruitment problem with C++ down the road. Old-timers today will start using all the new features. When management starts trying to add new team members, they will realize that it is really hard to find quality C++ developers.
Anyone who tries Go, Rust, Swift, Nim, D or some other modern/semi-modern language is going to ask themselves why on Earth they would want to torture themselves with C++.
It is easy to know why the world's highest-paid programmers, coding for the world's most demanding applications, use C++ and nothing but C++: nothing else is even trying to be useful in those applications.
C++ has sharp edges and pitfalls to stay clear of, so users ... do stay clear of them.
A usable, better language would gain users. But nothing is even on the horizon.
Rust is closest, but its designers have consciously chosen not to support the most powerful of C++'s features, to try to keep the language more approachable. Yet Rust's complexity is already beginning to rival C++'s. Some of that complexity lies in working around the language's deliberate limitations. As Rust matures it will suffer from unfortunate early choices in precisely the way C++ has, and will only get more complex.
Every choice in the C++ design has been to provide better ability to capture semantics in libraries, so that independent libraries integrate cleanly with each other and the core language. People can use libraries with confidence that they are giving up no performance vs. open-coding the same feature.
Access to the most powerful libraries depends on language features no other language implements. Thus, the best libraries will only ever be callable from C++ programs. With (literally!) billions of lines of code in production use, abandoning interoperability is not a choice to take lightly.
When you start a big project, you never know what it may come to need. If your language "won't go there", your program won't, either, and you will be stuck with unpleasant choices. This is the concept of a language's "dynamic range", a more meaningful measure than "high" or "low" alone: how high can it reach, how low can it reach, how far can it reach, at once? C++ is king of dynamic range. Nothing else comes close, or is really even trying.
> It is easy to know why the world's highest-paid programmers, coding for the world's most demanding applications, use C++ and nothing but C++: nothing else is even trying to be useful in those applications.
There's no proof of this. The world's highest-paid programmers tend to work for FAANGs and a few other categories of businesses, and they might or might not work in C++, and they tend to move up the ranks by being able to scale humans (other devs), not raw tech.
It's a myth that being an über-geek is well paying, by the way.
I will be sure to pass that fact along to all the well-paid über-geeks I know (who will be quite surprised at their misapprehension).
But there is no necessary relationship between "the world's highest-paid" and your notion of "well paying". You could simply be wrong, or your measure of "well paying" could exceed what the actual "highest-paid programmers" cited get.
Dan Luu wrote a good essay about programmer compensation a few years back.
Could you provide some examples of C++ features that the Rust team has consciously chosen to not support, to try to keep the language more approachable?
Could you show some examples of how you need to work around these deliberate limitations?
Function overloading. User-provided allocators for standard-library containers. Move constructors. Inheritance. Certain kinds of specialization. SFINAE. Somebody who knows both Rust and C++ better will be able to supply a longer list.
There is a corresponding list of features C++ doesn't have yet, and others it is precluded from having. That programmed move constructors can fail sucks. That moved-from objects still exist sucks.
Providing examples here would be more work than I am prepared for just now. (I am not happy to say so.)
There are good alternatives to C++: Rust and D. There are a number of languages with not-quite-as-high but decent performance and varying expressive power: Java, OCaml, Go, even Fortran for numerical stuff (not a joke; modern Fortran is quite advanced and most likely runs faster than C++).
I see rather few reasons to start a new project in C++ in 2021, even though in some niches nothing else is viable, sadly.
This one-page format using "concept" -> "example" -> "reasoning" is fantastic for people like me who used C++ a lot in the past, and haven't touched it* in decades but still want to keep up to date.
It probably helps that the author understands this well enough to ELI5. So thanks, Oleksandrikvl, whoever you are.
* And by "touched it" I mean used its deeper features, not just STL containers and simple classes (and for/auto). (I still use it for TFLiteMicro, but generally I see that most users are topical C++ programmers, like me.)
> This one-page format using "concept" -> "example" -> "reasoning" is fantastic
I agree, but I don't think that's what is happening here. It's documenting C++ "Concepts", which is the technical name for a particular part of the C++ language.
It's a great article though.
I was really looking forward to concepts.
But the actual implementation seems like a syntactic and (partially) semantic mess to me.
Obscure syntax (`requires requires`), soooo many different ways to specify things, mangling together with `auto`, mixing of function signature and type property requirements (`&& sizeof(T) == 4`), etc etc.
This reeks of design by committee without a coherent vision, and blows way past the complexity budget I would have expected to be spent.
Rust (traits), Haskell (type classes) and even the Nim/D metaprogramming capabilities seem simple and elegant in comparison.
The original C++0x concepts proposal had proper type signatures and was based, I think, on more traditional type theory. But it had to be continually tweaked because it did not work well in practice, so it grew a lot in complexity. Additionally, the only implementation was extremely slow to compile.
It was taken out of the standard, and the new version (aka "concepts lite") is actually much simpler, although expression-based. We lost the ability to type-check template definitions, though.
Far from being design by committee, I think it is for the most part the brainchild of a single author. The `auto` thing is definitely a committee addition: many vetoed "implicit" templates, and requiring `auto` after the concept name in the shorthand form was the compromise that pleased no one [1].
[1]: this is an obvious manifestation of Stroustrup's Rule
I haven't been following C++ for quite a while but when I did, I wanted modules. And now it looks like they're here and they've done it wrong. Or at least missed an opportunity to do it really right.
They've done the equivalent of wildcard (*) imports in languages like Java and Python. And style guides in those languages universally recommend against doing that.
Why? With named imports, if you see a symbol anywhere in the codebase, its declaration is somewhere within the file itself. If you see a call to foo(), it's going to be either a local function or a declared import. With C++ modules (as with C++ includes) it could come from any of the imports, so you have to look outside of the file to figure out where it came from.
Sure, IDEs help paper this over somewhat. But it just seems sloppy for a post-1980s language feature to throw all imports into the global namespace.
That's because modules and namespaces are the same thing in languages like Python, whereas they are separate in C++. The code in the imported module will go into whatever namespace it is in within that module, not the global namespace.
I had forgotten about C++ namespaces. It's been quite a while.
I'm not sure that this addresses my concern though. Do namespaces enable the import of specific symbols from a module?
I found the CppCon video on C++20 features to be very informative, and I am honestly excited to use ranges and the other items mentioned, unlike the other guys on here who hate progress.