On the other hand, I was not so lucky with C. I tried a very simple C program, compiled it with #llvm-mos mos-mega65-clang, and it didn't work as expected. I recompiled it with mos-c64-clang from the very same toolchain and it works like a charm in GO C64 mode. I'm not sure yet whether the problem is the emulator or the compiler.
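For reference, the kind of trivial test case I mean is nothing fancier than this (a minimal sketch, not my exact program; the llvm-mos SDK wrappers are invoked the same way for both targets):

```c
/* hello.c - minimal llvm-mos test case (illustrative only, not the
 * exact program from the post above).
 *
 * Build, assuming the llvm-mos SDK wrappers are on PATH:
 *   mos-mega65-clang -Os -o hello-mega65 hello.c
 *   mos-c64-clang    -Os -o hello-c64    hello.c
 */
#include <stdio.h>

int main(void) {
    puts("HELLO FROM LLVM-MOS");
    return 0;
}
```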
All I want is a collection of #binutils, #GCC, #llvm+#clang, #glibc and #musl builds that are "freestanding" / relocatable, which I can pack into a #squashfs image and carry around to my various development machines.
You'd think that for something as fundamental as compiler infrastructure, with over 60 years of accumulated knowledge, the whole bootstrapping and bring-up process would be super streamlined by now, or at least mostly pain-free.
Yeah, about that. IYKYK
LLVM has got some competition.
TPDE Compiler Back-End Framework
https://arxiv.org/abs/2505.22610
"TPDE-LLVM: a standalone back-end for LLVM-IR, which compiles 10--20x faster than LLVM -O0 with similar code quality, usable as library (e.g., for JIT), as tool (tpde-llc), and integrated in Clang/Flang (with a patch)."
Holy cow!
Open Source on GitHub:
https://github.com/tpde2/tpde
We are discussing a moonshot project in the #LLVM community right now: the limitations of computing KnownBits on demand, without caching and with a hard recursion-depth cutoff, are very frustrating. We're considering investing in a large project to fix this once and for all. Exciting times!
https://discourse.llvm.org/t/rfc-computeknownbits-recursion-depth/85962/10
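To give a flavour of the limitation, here is a standalone toy sketch of the general on-demand scheme (not LLVM's actual ValueTracking code): anything below the fixed depth cutoff is reported as "nothing known", and shared subexpressions get re-walked on every query because no results are cached.

```c
/* Toy known-bits analysis over a tiny expression tree.
 * Standalone illustration of the on-demand + recursion-limit scheme;
 * NOT LLVM's actual ValueTracking implementation. */
#include <stdint.h>
#include <stdio.h>

enum Op { OP_CONST, OP_AND, OP_OR, OP_SHL };

struct Expr {
    enum Op op;
    uint32_t imm;                  /* constant value or shift amount */
    const struct Expr *lhs, *rhs;
};

struct KnownBits {
    uint32_t zero;                 /* bits known to be 0 */
    uint32_t one;                  /* bits known to be 1 */
};

#define MAX_DEPTH 6                /* roughly analogous to LLVM's MaxAnalysisRecursionDepth */

static struct KnownBits known_bits(const struct Expr *e, unsigned depth) {
    struct KnownBits unknown = {0, 0};
    if (depth > MAX_DEPTH)
        return unknown;            /* cutoff: pretend we know nothing */

    struct KnownBits l, r, res = unknown;
    switch (e->op) {
    case OP_CONST:
        res.one = e->imm;
        res.zero = ~e->imm;
        break;
    case OP_AND:                   /* a bit is 1 only if known 1 on both sides */
        l = known_bits(e->lhs, depth + 1);   /* recomputed on every query:   */
        r = known_bits(e->rhs, depth + 1);   /* nothing is cached anywhere   */
        res.one = l.one & r.one;
        res.zero = l.zero | r.zero;
        break;
    case OP_OR:
        l = known_bits(e->lhs, depth + 1);
        r = known_bits(e->rhs, depth + 1);
        res.one = l.one | r.one;
        res.zero = l.zero & r.zero;
        break;
    case OP_SHL:                   /* low bits become known zero */
        l = known_bits(e->lhs, depth + 1);
        res.one = l.one << e->imm;
        res.zero = (l.zero << e->imm) | ((1u << e->imm) - 1);
        break;
    }
    return res;
}

int main(void) {
    struct Expr c = {OP_CONST, 0xF0, NULL, NULL};
    struct Expr s = {OP_SHL, 4, &c, NULL};
    struct Expr a = {OP_AND, 0, &s, &s};   /* shared subexpression: walked twice */
    struct KnownBits kb = known_bits(&a, 0);
    printf("known one: 0x%08x  known zero: 0x%08x\n",
           (unsigned)kb.one, (unsigned)kb.zero);
    return 0;
}
```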
Can anyone familiar with building LLVM/clang to target an embedded platform tell me what I’m doing wrong here, or whether my expectations are wrong? I can build clang/lld/lldb but if I add in compiler-rt the build fails for reasons I consider mysterious. https://discourse.llvm.org/t/difficulty-building-clang-from-20-1-6-for-armv8-unknown-unknown-eabihf-elf/86633 #llvm
Linux 6.16 will need GCC 8 and Binutils 2.30 to build.
#Linux #Kernel #LinuxKernel #Computers #Laptops #TechNews #TechUpdates #GCC #Clang #LLVM #Binutils
https://officialaptivi.wordpress.com/2025/06/01/linux-6-16-needs-gcc-8-and-binutils-2-30/
Perfectly normal post on the LLVM forum
#EuroLLVM is a good opportunity to talk about the #LLVM community. No, not at the conference.
Because if you were ever wondering what the LLVM project's attitude towards its volunteer contributors is, just look at the ticket prices. I mean, which volunteer would spend $750 on a conference ticket?!
But yeah, we know our place. It's to spend weekends fixing what corporate contributors broke during the week, then beg them to actually review our fixes before they break more. And in the meantime, our gracious lords will debate how to mess up our future even more.
2025 and clang-format still can't enforce a space before function calls https://releases.llvm.org/20.1.0/tools/clang/docs/ClangFormatStyleOptions.html #llvm #clang #format
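For the curious, the style in question looks like the snippet below, formatted by hand (as far as I can tell, the per-construct SpaceBeforeParens options cover declarations, definitions, control statements and a few other cases, but there is no knob that targets an ordinary call on its own):

```c
/* Hand-formatted illustration of the desired output: a space before the
 * parentheses of plain function calls, GNU-coding-style fashion. */
#include <stdio.h>

int main(void)
{
    printf ("hello\n");   /* space before the call's '(' */
    return 0;
}
```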
One of the reasons I'm still using GitHub for a lot of stuff is the free CI, but I hadn't really realised how little CI actually costs. For #CHERIoT #LLVM, we're using Cirrus-CI with a 'bring your own cloud subscription' thing. We set up ccache backed by cloud storage, so incremental builds are fast. The bill for last month? £0.31.
We'll probably pay more as we hire more developers, but I doubt it will cost more than £10/month even with an active team and external contributors. Each CI run costs almost a rounding-error amount, and that's doing a clean (+ ccache) build of LLVM and running the test suite. We use Google's Arm instances for all CI, since they have amazingly good price:performance (much better than the x86 ones), and build the x86-64 releases on x86-64 hardware (we do x86-64 and AArch64 builds to pull into our dev container).
For personal stuff, I doubt the CI that I use costs more than £0.10/month at this kind of price. There's a real market for a cloud provider that focuses on scaling down more than on scaling up and makes it easy to deploy this kind of thing (we spent far more money on the developer time needed to figure out the nightmare GCE web interface than we've spent on the compute; it's almost as bad as Azure and seems to be designed by the same set of creatures who have never actually met a human).