That's rough.
I'm glad I got my degree in the previous decade, before orange julius took over and ruined everything.
My blog: https://rldane.space/
My scripts: https://codeberg.org/rldane/scripts
#Fediversian since late 2018, full-time #fediholic since early 2022.
First-wave #twexodee.
NOTE: Most of my toots are in Markdown (rich text). Your client might mangle the formatting. Public statuses are best viewed on this instance's web interface; your client should give you an option to copy the post's link.
Imported profile from fosstodon:
Involuntary time-traveler, recipient of offensive grace. Quasi-technical Linux and FOSS enthusiast. Armchair privacy advocate.
Profile pic is my own, copyright me.
Header image courtesy of NASA: https://unsplash.com/photos/Q1p7bh3SHj8
My #interests:
#StarWars
#StarTrek
#Linux
#UNIX
#Bible
#Christianity
#Jesus
#AmateurRadio
#Bash
#Dallas
#Writing
#Poetry
#Space
#KSP
#Tea
#FountainPens
#Journaling
#TabletopRPG
#RetroComputing
#ClassicMac
#uxn
#subtoot, but I just read a well-intentioned toot from a FOSS technical brand account that said,
"Education in 2025 is all about leveraging the right digital tools to enhance learning and teaching experiences."
And that just makes me so very sad.
Like... get a freaking book, the more philosophical and weird the better, go make a little picnic, read your book, and think big, complex thoughts about life.
THAT is PEDAGOGY.
Not using the latest technical geegaw that will be utterly forgotten in five years.
* kicks dirt
* yells at cloud
#I_wanted_to_be_a_professor_but_the_academy_died_while_I_was_still_in_it
#Underrated #KDE #Plasma feature: #Activities
Folks get these confused with virtual desktops, but think of it more like a totally separate environment for different tasks/projects.
You may use any number of virtual desktops to help organize the various subtasks or applications within a given job, but let's say you want a whole 'nother set of desktops for a different task.
Let's say you do your main work in your main Activity, with six or more desktops all firing on all cylinders: terminal windows, browser windows, whatever. But you've set aside a day for training, and you want a totally clean environment just for training/classes, without closing out everything you have going on in your work Activity, because you want to jump right back into it tomorrow after training's done.
No worries, just go into System Settings, Apps & Windows -> Activities, and create a new activity. You can name it and give it its own icon, etc.
I use Super+~ to switch between Activities (I confess I've forgotten what the default shortcut is, but you can find it under System Settings, Input & Output -> Keyboard -> Shortcuts, and either look it up there or change it to whatever you like).
Am I the last person in the world who thinks that if you have more than 10 tabs open from day to day, you're doing something incorrectly? XD
Thinking of trying #postmarketOS on my #PinebookPro.
I didn't even know you could run it as a desktop OS.
Supposedly the installer supports #FullDiskEncryption, which is... "poggers," I think the kids say.
I think we're more sensitive to flicker than anything else. Not sure why.
I think persistence of vision (i.e., flicker sensitivity) is a different mechanism from the actual perception of motion.
I think a lot of this is because each frame is very discrete, whereas in real life, the "frames" are blended.
Like, if they recorded 24fps movies by shooting 96Hz video and keeping only every fourth frame, it would be way more annoying than if the film were recorded at 24fps, or if each group of four frames were blended to combine 96Hz down to 24Hz.
So what we're noticing is the jumps in the motion @ 60Hz, not so much the actual visual difference between the image being updated 60 vs 90 times per second.
Does that make sense?
If you took an animation sequence, rendered it at 240Hz, and blended those frames down to 60Hz, I'd wager that you might not be able to tell the difference between true 240Hz and 240Hz smoothly blended down to 60Hz.
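The decimation-vs-blending idea can be sketched with a toy example. This is purely illustrative (the 1-D "video", frame rates, and pixel values are all made up): a bright dot sweeps across the screen at 96 frames per second, and we produce 24fps output two ways.

```python
# A toy 1-D "video": one bright pixel sweeping across the screen,
# captured at 96 frames per second for one second.
HIGH, LOW = 96, 24
RATIO = HIGH // LOW  # 4 source frames per output frame

frames = []
for t in range(HIGH):
    frame = [0.0] * HIGH
    frame[t] = 1.0  # the dot advances one pixel per frame
    frames.append(frame)

# 1. Decimation: keep only every fourth frame.
#    Each output frame has one sharp dot, but motion jumps 4 pixels at a time.
decimated = frames[::RATIO]

# 2. Blending: average each group of four frames.
#    The dot smears across the positions it occupied -- film-style motion blur.
blended = []
for i in range(0, HIGH, RATIO):
    group = frames[i:i + RATIO]
    blended.append([sum(col) / RATIO for col in zip(*group)])

print(len(decimated), len(blended))        # 24 24
print(max(decimated[0]), max(blended[0]))  # 1.0 0.25
```

Both outputs are 24 frames, but the decimated frames keep full brightness at a single pixel (hard jumps between frames), while each blended frame spreads a quarter of the brightness over the four positions the dot passed through, so the motion reads as continuous.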
I had heard that the human brain only "refreshes" the image at about 40Hz, but that was from a 1990s miniseries on the brain. There may be better data now.
Also, people tend to be sensitive to flicker at higher rates than the brain's main "refresh rate."