6/6
How to Nominate
Deadline: end of 31 May 2025, AoE
To nominate a paper, please fill in this form: https://forms.gle/V4eYuGKGSHjokpBw7
A nomination may come from anyone. Posthumous awards will be considered. All papers can be found online on the Springer webpage: https://link.springer.com/journal/10601/volumes-and-issues
Help us recognise and celebrate the research that has shaped our field, and submit your nomination today!
#BestPaperAward
#ConstraintProgramming
#AI
#CallForNominations
#Optimization
#AcademicMastodon
5/6
Evaluation Criteria
Factors influencing the award decision include:
- Did the paper start a significant new line of research?
- Has the paper made a major theoretical advance?
- Has it heavily influenced other researchers (whether inside or outside CP)?
- Has the paper influenced applications?
#BestPaperAward
#ConstraintProgramming
#AI
#CallForNominations
#Optimization
#SpringerPublishing
#AcademicMastodon
3/6
Prominent Paper Award
This award honours papers that are exceptional in both their significance and impact on the field of Constraint Programming, and that were published in Constraints between 2018 and 2024 (inclusive).
The publication year refers to the publication year of the issue in which the paper appeared.
#BestPaperAward
#ConstraintProgramming
#AI
#CallForNominations
#Optimization
#SpringerPublishing
#AcademicMastodon
2/6
The Constraints journal is seeking nominations for two awards that celebrate outstanding contributions to the field of constraint programming. Please fill in this form by the end of 31 May 2025 AoE to nominate a candidate for the following awards (one nomination per form): https://forms.gle/V4eYuGKGSHjokpBw7
#BestPaperAward
#ConstraintProgramming
#AI
#CallForNominations
#Optimization
#SpringerPublishing
#AcademicMastodon
1/6
Don't forget to submit your nominations for the Prominent and Classic Paper Awards!
Nomination deadline: 31 May 2025, AoE.
More info in thread
#BestPaperAward
#ConstraintProgramming
#AI
#ACP
#ArtificialIntelligence
#AssociationForConstraintProgramming
#CallForNominations
#CP
#Optimization
#OutstandingPaperAward
#Awards
#springer
#SpringerPublishing
#AcademicMastodon
#AcademicExcellence
#AcademicChatter
#ComputerScience
From: blenderdumbass.org
Working to make a game that is very hard to optimize a bit more respectable when it comes to performance, which is easier said than done. This video is a journey through the pain that is optimization in UPBGE (the game engine chosen for Dani's Race).
Featuring a new soundtrack for Dani's Race called "Light ...
I just finished a #video that took me 10 days to edit and a month to do overall. I had 60 hours of footage.
It is about #Gamedev #Optimization in #UPBGE
Embark on a timeless adventure! Cyan's 'Myst' trilogy is now optimized for Apple Silicon, bringing enhanced performance and immersive gameplay to Mac users. #Adventure #Myst #Apple #Silicon #Mac #Technology #Gaming #Performance #Immersive #Cyan #Optimization #Trilogy #Support #Revitalization #Native More info at: https://thedailytechfeed.com/cyan-revitalizes-myst-trilogy-with-native-apple-silicon-support/
On Wednesday, June 4, we will have the pleasure of welcoming VictorGallet from @bpifrance@x.com to talk to us about Async Profiler: optimizing your applications #flamegraph #optimization https://www.alpesjug.fr/?p=3566
Advice to Finland: Speak English at work, take fewer exams, and end home care support
The OECD's latest country report points to a brighter economic future for Finland, but only if it optimises the use of public funds.
No matter how good your #tech is, #developers will always be garbage and will always catch up to your #phone or #desktop, as they don't believe in #optimization and just go:
HEY LOOK, STUFF HAS MORE MEMORY AND PROCESSING POWER, LET'S CRAM A WHOLE BUNCH OF EXTRA SHIT IN NOBODY ACTUALLY NEEDS AND MAKE IT LOOK PRETTY IN THE MOST INEFFECTUAL WAY!!!
Some do, and you folks are doing GOD'S WORK.
Even with 4 gigs of ram on my phone and 8 cores, I still have #apps freeze everything up.
My very-very-WIP Rust compiler currently has one mode of operation, where it checks the Cargo projects in all the directories specified on the command-line. The first step, for every project, is to find and load the manifest file. When you're only working with one project (even if it's a workspace), nothing can happen until that manifest is loaded.
I'm currently using the cargo_toml crate to parse the manifest, but I've noticed that it's a bit too slow for my use case. Opening the manifest file takes ~4.5µs, and reading its contents into memory takes ~4.8µs. But parsing the file -- an operation that needs no I/O -- takes a whopping 174µs! That's about 10 times longer than I expected.
The latency is probably due to the number of allocations performed, overhead due to serde, or inefficiency in the parser itself. I'm going to try writing a simple streaming TOML parser to make do instead, with a target latency of 5µs. Let's see whether that's achievable.
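For context, here is a minimal sketch of how that I/O-versus-parsing split can be measured, assuming the cargo_toml crate as a dependency; the manifest path and the printed field are illustrative, not taken from the post:

```rust
use std::fs;
use std::time::Instant;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical manifest path; the original post checks every project
    // passed on the command line.
    let path = "Cargo.toml";

    let t_read = Instant::now();
    let contents = fs::read_to_string(path)?; // open + read into memory
    let read_elapsed = t_read.elapsed();

    let t_parse = Instant::now();
    // Pure in-memory parsing: no I/O happens here, so any latency comes from
    // the parser (and serde) itself.
    let manifest = cargo_toml::Manifest::from_slice(contents.as_bytes())?;
    let parse_elapsed = t_parse.elapsed();

    println!("read:  {read_elapsed:?}");
    println!("parse: {parse_elapsed:?}");
    println!("has [package] section: {}", manifest.package.is_some());
    Ok(())
}
```

A single µs-scale measurement like this is noisy; in practice you would repeat it in a loop or use a benchmarking harness before drawing conclusions.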
A primer on Maximum Power Point Tracking solar charge controllers. This device goes between the solar panel and the battery and — not to put too fine a point on it — controls the charging of the battery to optimize power transfer from the solar panel (and, hopefully, maximize battery life I would assume).
I saw one mentioned in a YouTube video, likely sponsored, but if you have one that you are happy with, feel free to recommend.
https://www.electricaltechnology.org/2021/07/mppt-solar-charge-controller.html
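For the curious, the core of an MPPT controller is usually a hill-climbing loop. Below is a minimal perturb-and-observe sketch: nudge the operating voltage, keep going if the panel's output power rose, reverse direction if it fell. The panel curve is a made-up toy model just so the loop runs; a real controller measures voltage and current at the panel terminals instead.

```rust
// Toy PV curve: current stays near 8 A, then collapses as the voltage
// approaches a hypothetical open-circuit voltage of ~36 V.
fn panel_power(v: f64) -> f64 {
    let i = (8.0 * (1.0 - (v / 36.0).powi(4))).max(0.0);
    v * i
}

fn main() {
    let mut v = 20.0;   // operating-voltage set-point
    let mut step = 0.5; // perturbation size and direction
    let mut last_p = panel_power(v);

    for iter in 0..40 {
        v += step;
        let p = panel_power(v);
        if p < last_p {
            step = -step; // power dropped, so perturb the other way
        }
        last_p = p;
        println!("iter {iter:2}: V = {v:5.2} V, P = {p:6.1} W");
    }
}
```

Commercial controllers layer refinements on top (variable step sizes, periodic sweeps), but this hill-climbing core is the basic idea.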
Well… everything I have a keybind for in Qtile, and Qtile itself for that matter, has its own thread now.
Also, I am addicted to Rust now for the same reason.
I can almost guarantee you that my setup is way more responsive than anything you've ever seen on r/UnixPorn, and there's still absolutely no reason why I couldn't rice the hell out of it if I wanted to, I guess. I'm not going to, but I sure could, lol.
Seriously, though, this is why I use a tiling window manager: not because they look cool, but because they let you do weird things that save, by my estimate, enough clock cycles objectively that everything actually feels faster subjectively. The thing that fascinates me most about Unix-based operating systems is a mindset that is, for the most part, older than I am.
Or I would love to be wrong about that last part.
When you look at old-school Unix stuff like grep, it was faster than it had any right to be, for the simple reason that it literally had to be. The hardware they were working with back then was capable of practically nothing by any standard that is even somewhat modern. I feel like no matter how good our hardware gets, the day that mentality is lost to history is the day beyond which no hardware that could ever exist, even in theory, will be enough.