

sounds like you should transition to being an architect


it’s melted snowflakes. do try to keep up.
not tried gemma yet, i’ve stayed away from google stuff. maybe i’ll give it a shot.
yeah one of those framework machines with 128GB shared ram would have been amazing. shame they’re sending money to racists.
one of my most recent fun activities came from discovering the “allow editing” button in koboldcpp. since the model is fed the entire conversation so far as its only context, and doesn’t save data between iterations, you can basically rewrite its memory on the fly. i knew this was possible, but i’d never thought to do it until there was an easy ui option for it, and it turned out to be a lot of fun, because when using a “thinking” model like qwen3.5 you can convince it that it’s bypassing its own censorship.
basically you give the model a prompt to work off of, pause it in the middle of the thinking process, change previous thoughts to something it’s been trained to filter out (like sex or violence or opinions critical of the ccp), and it will start second-guessing itself. sometimes it gets stuck in a loop, sometimes it overcomes the contradiction (at which point you can jump in again and tweak its memory some more) and sometimes it gets tied up in knots trying to prove a negative.
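stripped of the ui, the whole trick is just string surgery on the prompt between completion calls. a minimal sketch of the idea in python (generate_fn is a stand-in for whatever backend you run, not a real koboldcpp api):

```python
# sketch of the memory-rewrite trick: the model's only state is the text
# it's handed, so editing earlier "thinking" before resuming changes what
# it believes it already said. generate_fn stands in for whatever backend
# completes raw text for you; here it's just any function str -> str.

def resume_with_edited_memory(context, old_thought, new_thought, generate_fn):
    """Pause mid-generation, rewrite part of the context, then resume."""
    edited = context.replace(old_thought, new_thought)  # rewrite its "memory"
    continuation = generate_fn(edited)  # model picks up from the edited text
    return edited + continuation
```

each call is stateless, so you can repeat this every time the model starts reasoning its way back out of the edit.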
a previous experiment was feeding stable diffusion images back into itself to see what happens. i was inspired by a talk at 37c3 where they demonstrated model collapse by repeatedly trying to regenerate the same image they put in (i think this was how sora worked).
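you can reproduce the collapse dynamic without a gpu, by the way: any lossy round-trip fed back into itself does the same thing. a toy stand-in in python, where a “generation” is just a smoothing pass instead of a full encode/decode:

```python
# toy analogue of the feedback experiment: each "generation" is a lossy
# reconstruction (here, a local averaging pass), and its output becomes
# the next input. the signal's variance drains away round after round,
# which is roughly what repeated img2img does to image detail.

def one_generation(signal):
    """Lossy 'reconstruction': every sample becomes a local average."""
    padded = [signal[0]] + signal + [signal[-1]]
    return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3
            for i in range(1, len(signal) + 1)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

signal = [0.0, 1.0] * 16  # a maximally "detailed" input
history = [variance(signal)]
for _ in range(20):  # twenty rounds of feeding the output back in
    signal = one_generation(signal)
    history.append(variance(signal))
# by the end, almost all of the original detail is gone
```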


i will never understand that attitude. four hours of exploration, learning and puzzle-solving sounds like the best part of the job. an isolated, well-specified problem that can be completed in a day is like the most fun you can have programming. why swap that for an hour of code review?
i did my first machine learning course more than 10 years ago, so i’m not ashamed to admit that i bought beefier hardware to play around with local models in early 2023. i still like doing that. mostly because i know my gpu is powered entirely off of fossil-free energy and because i decided early on not to spew the output all over the internet unless it was poignant. or funny. not as in “the llm told a good joke”, more as in “i compressed this poor thing to fit on a cd and now it can only talk about dolphins”.
qwen3.5-12B really screams along on a 7900xtx. like, up to 70-100 tokens a second. perfect for seeing the results of your torture methods quickly.


no, robert deniro is not “in heat”
hi beep.
Edit: just so you know, by editing the image without disclosing it you’re violating the CC-BY-SA 4.0 license. so… quit that shit.
“this isn’t a beach, this is a bathtub!”
aaaaaand quit


is germany increasing or decreasing the amount of coal it burns every year?
right now, they’re ramping up massively.
they have a blog post on this
that’s nice, if a bit loose. i hope it’s a working solution.
you’ve been out of the game for too long, they switched to google a while ago
that link is broken for me, but from their privacy policy it seems they use both. not that google is any better, unfortunately.
For the first time in its 16-year history, Ecosia users in France will now receive a proportion of their search results directly from EUSP’s independent European index.
that’s very good to hear. hopefully it rolls out to more than just france.
Try to stay positive eh ;)
thanks for the reminder. genuinely.


think of it like this. you build small tools for internal use, so code quality, maintainability and documentation are not your highest priorities, right? most important is to ship the features needed by your colleagues right now. i’ve been in that same boat, building internal testing tools for a big multinational.
say your tool contains shortcuts for interacting with some internal database. you don’t need auth because it’s all on the internal net, so it’s just a collection of shortcuts. you don’t really care about maintainability, because it’s all just temporary, so you throw every new request together in the fastest way you can, probably trying out new techniques to keep yourself entertained. you don’t really care about testability either, because you’re the only one testing: you can check that everything works before doing a release, and if something slips through, one of your colleagues will walk down the hall and tell you.
now imagine it gets enough attention that your boss says “we want our customers to use this”. suddenly your priorities are upended: you absolutely need auth, you definitely need testability, and you absolutely definitely need the tool to not mess things up. in the best possible world, you can reimplement the tool in a more manageable way, making sure every interface is consistent, documentation is up to snuff for users, and error handling is centralised.
the claude cli leak is the opposite of that. it’s the worst code quality i’ve ever seen. it’s full of errors, repeated code, and overzealous exception handling. it is absolutely unmaintainable. the functionality for figuring out what type a file is and reading it into a proper object is 38 000 lines of typescript, excluding the class definitions. the entire thing is half a million lines. the code for uploading a pdf calls the code to upload jpegs, because if a file isn’t identified it’s automatically a jpeg. and jpegs can go through the same compression routine up to 22 times before it tries to upload them, because the handler just calls itself repeatedly.
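for flavour, the shape of that pattern as i understood it (a caricature in python, not the leaked code):

```python
# caricature of the described pattern -- not the actual leaked code, just
# its shape: the jpeg handler "compresses" by calling itself until the
# payload fits or a pass limit is hit, and everything unidentified falls
# through to the jpeg path by default.

MAX_BYTES = 100
MAX_PASSES = 22

def upload_jpeg(data: bytes, passes: int = 0) -> bytes:
    if len(data) <= MAX_BYTES or passes >= MAX_PASSES:
        return data
    recompressed = data[: int(len(data) * 0.8)]  # stand-in for a re-encode
    return upload_jpeg(recompressed, passes + 1)  # handler calls itself

def upload_file(data: bytes, kind: str) -> bytes:
    if kind == "pdf":
        return upload_jpeg(data)  # pdf path routed through the jpeg path
    # anything unidentified is "automatically a jpeg"
    return upload_jpeg(data)
```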
and this is the code they thought was robust enough to withstand the internet. imagine what their internal tooling looks like.


ew :(


so fun fact, i live within walking distance of that dc project (close enough that i was invited to attend a presentation by edc on what it would do to the immediate area) and i’ve done the math on the grid load. the hydro plants in the city currently meet about 40% of its needs. edc wants 750MW of reserve power generation (that’s diesel) installed on the site. even if we assume that’s double their average usage, they will still more than double the energy needs of the city. and that’s in the middle of an infrastructure crisis that means we’re stuck buying german coal.
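the back-of-the-envelope version, in case anyone wants to plug in their own numbers (only the 750MW reserve figure is from edc’s presentation; the 2x sizing assumption and the city-wide average are my placeholders):

```python
# rough numbers behind the claim above. only the 750 MW reserve figure is
# edc's; the 2x sizing assumption and CITY_AVG_MW are placeholders --
# substitute real figures for your own city.

DC_RESERVE_MW = 750
dc_avg_mw = DC_RESERVE_MW / 2   # assume reserve = 2x average draw -> 375 MW

CITY_AVG_MW = 300               # hypothetical city-wide average load
HYDRO_SHARE = 0.40              # local hydro covers ~40% of current needs
hydro_mw = CITY_AVG_MW * HYDRO_SHARE

growth = (CITY_AVG_MW + dc_avg_mw) / CITY_AVG_MW
# with any city average below 375 MW, growth > 2: the dc alone more than
# doubles the city's needs while local generation stays fixed
```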
ecosia has always felt kinda skeevy to me, because tree planting isn’t actually carbon positive. it only becomes positive after 40 or so years, and only if the trees are not cut down, which is usually what happens. also, just like ddg, ecosia depends on bing, the search provider with the worst carbon footprint, to function.
and i probably don’t need to tell you what cobalt mining does to the environment or the people who handle it.
so it does help to look deeper at the things in your daily life :)


what’s everyone else doing then? isn’t it specifically for code?


after the client leak i should think it was even more poignant. a bit like tearing down a wall and finding it full of mouldy razor blades


i’ve not actually looked at the demographics. is that available somewhere?


a google ai search, sure. a normal web search uses several orders of magnitude less energy.


sure let’s just burn down half a hectare of the amazon for a seven line script.
engineering philosophy is where the rubber hits the road. building software with the right philosophy can mean the difference between needing a datacenter or a raspberry pi for the same job. it directly translates to money saved in recurring costs.
…you okay man?