Dark mode
My blog should now respect your device's theme setting, whether that's the traditional light mode or the more recent dark mode.
This was achieved by hacking the CSS to fiddle the colours as necessary, and tweaking the buttons and other graphics in Paint (yes, Paint under RISC OS can create PNGs with transparency now!). There's nothing fancy like alpha blending; I've just masked the "background colour" so it doesn't show. I had to recreate the "Navi:" text (mobile mode) because black on black doesn't work. It's now a sort-of-cyan, picked as a colour that ought to be visible on both white and black.
It isn't perfect. For that I'd need to either use complicated alpha blending or recreate the graphics to depend upon the theme. That's a lot of work. So, sorry, it'll look suboptimal. That being said, the light blue around the country does, actually, help it to stand out more. I think the only real problem is the "Navi" text in mobile mode. Maybe I'll just get rid of it?
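For the curious, the mechanism behind the theme switching is the prefers-color-scheme media query in CSS. A minimal sketch of the idea looks something like this (the selectors and colour values here are illustrative, not the actual ones from my stylesheet):

/* Default (light) theme. */
body {
    background: #ffffff;
    color: #000000;
}

/* If the device/browser reports dark mode, swap the colours over. */
/* Illustrative values only, not my actual stylesheet.             */
@media (prefers-color-scheme: dark) {
    body {
        background: #000000;
        color: #ffffff;
    }
    a {
        color: #00c0c0;    /* a sort-of-cyan that shows up on both white and black */
    }
}

The browser simply applies whichever block matches the theme that your device or browser reports, which is why there's no light/dark toggle on the page itself.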
This is how my blog looks with Android's Firefox in desktop mode.
Firefox, desktop, dark.
And here is how my blog looks with Android's Chrome in the default mobile mode.
Chrome, mobile, dark.
How to check/set the dark mode options
Look in your browser's settings for "Theme" or "Customisation" or something like that. You ought to see three options:
Use light mode
Use dark mode
Use device theme
This allows you to set up light mode (always), dark mode (always), or to switch according to what the device is currently set to use (which can be a forced toggle or change according to time of day). Here I am talking about Android as my phones are Android, but I would expect Apple devices to be broadly similar.
How to override the dark mode choice
In short, you can't. My blog now asks the browser to respect your chosen settings. Either your browser has been set to use dark mode, or your device is using dark mode and the browser is obeying that. To toggle light/dark mode, you'll need to alter your device or browser settings... on the basis that it's doing this exactly because your device asked for it.
Advantages of dark mode
Less eye strain for starters. Back In The Day, it was common/normal to have white (or green, or amber) text on a black screen. I am writing this text in Zap with white text on black.
It wasn't until the desktop metaphor appeared that people started using windows with white backgrounds and black text, as it resembled a piece of paper (think of Microsoft Word or any desktop publishing package).
This was copied in the browser as it's what people expected a page to look like.
However, it is hard on the eyes, not particularly suitable for long term use or use in the evening, and on modern mobile devices it can directly affect battery consumption. If the device has an (AM)OLED display, then each pixel is creating light itself (rather than the older method of selectively blocking light from a backlight panel). This means that the more pixels that are lit up, the higher the consumption. If you are lucky your display might have white sub-pixels; otherwise white is created by lighting up all three of the red, green, and blue sub-pixels.
It's remarkably hard to calculate as the red, green, and blue LEDs not only use different amounts of power for the same brightness (green uses the least, blue the most), but they also run at different brightnesses (because of the sensitivity of our eyes), and as if that wasn't enough, the overall brightness is not linear. Stepping from 70% to 100% might use as much as 80% more power for only a 30% increase in brightness.
So exact figures are complicated and will depend upon your display panel, but one thing that we can say with ease is that a page with a white background could easily consume 2× to 4× more power than the same content on a black background.
(plus, OLED, being organic, doesn't appreciate being run hard; screen burn is a thing)
Anyway, I hope this makes my blog more compatible with late night reading if you fancy reading random rubbish because you can't sleep.
I can empathise. I don't think I've had a good night of sleep in ages... that's just how it is for some of us.
At the stroke of midnight...
Long exposure photo at midnight.
It was rather chillier than before. About 7°C. Which, granted, isn't unusual for the 1st of January, but given it's been twice that in the day...
It's traditional to eat something, so I did.
Beans on toast.
Proper beans (Heinz, in this case). Not those gloopy haricots-en-sauce-tomate (beans in tomato sauce) that one normally sees over here, which are basically "beans" by somebody who has only read a description (the sauce is like that naff orange stuff that you get on cheap pizzas).
Why beans instead of something real? Well... there's a story to that.
Earlier in the day I wanted to arrange my linguine in a sealed glass jar, but the pieces of linguine were longer than the height of the jar.
No problem, I thought. I'll just snap off the inch or so at the end until the linguine fits.
And I did.
For three packs.
Linguine.
Remaining on a paper plate was an unexpectedly large pile of linguine bits. Which I felt obligated to cook. I mean, it was just sitting there so...
...I think I must have made my way through over a kilogram of it at the end. I guess it's just as well that buttered linguine with a sprinkle of pepper is my "comfort food".
Speaking of comfort food, here's something I made to celebrate the arbitrary changing of the number on the wall (that we need to bodge slightly every four years).
Let there be cake!
A nice fluffy cake dusted with icing sugar, with about half a jar of black cherry jam between the layers.
Rain
I had to hurry to feed Anna, since it was chucking it down. It's supposed to go on like this all night, and the département is on yellow (lowest) alert for flooding. And, later, wind.
Yup. Here comes another storm. It's the first day of the year and I've already lost track of how many storms have come through recently.
But... actual tornados in Manchester. I didn't even know that was a thing.
Are you a 10 if you have Android?
For your amusement, filed under the Science/Technology part of the Daily Mail is the article Can a man still be a '10' if he has an Android?.
Yes, people, this is what passes for "science" in the Fail.
Anyway, some random TikTok person went around asking women whether a guy having an Android phone is a deal breaker, with some of the women saying that a guy choosing Android would be downgraded to a '3' or perhaps even a '0' because - get this - green text bubbles and bad quality photos. One woman even said that the photos wouldn't be compatible.
One woman said "Are you broke? How can you not afford an iPhone?".
I'm not sure I'd want to even attempt to go out on a date with a person who would be so vapid as to want to rank me on an "out of ten" scale, and that's not because I'd score low, it's more that it's a bit of a crappy thing to do. Either you're interested in somebody or you aren't.
And, no, there is no man (or woman) alive that would be a ten. I'd expect somebody who is a ten to be a supernatural deity, and dating one of those is going to be a headache. There's plenty of mythology about what happens when humans try to go out with gods...
For my part, I wouldn't downgrade anybody based upon their phone. I prefer Android as I want to run the apps I want (like different browsers with different rendering engines), have access to the filesystem, and generally be able to interact easily with my other devices. In my time of owning an iPad (iOS6/iOS7), what stood out most for me was how it was incompatible by design with everything else and how completely locked down it was. Therefore, I think it's fair to say that if you like the freedom of using your device the way you want, you'd probably choose Android, and if you like the training wheels and safety net then maybe Apple is more your thing? Or, maybe you're just a sheep and feel obliged to have Apple because your peer group does?
The comment about photos not being compatible says a lot about the technical competence of the people this guy was talking to, such that they probably wouldn't even understand my reasons for preferring Android.
Oh, and as for the women pictured in the article? All quite firmly in the "meh" category, going by appearances alone, in my opinion.
Are we living in a simulation?
Still in the Fail's science/tech section, Are we living in a simulation? is about a physics professor (Melvin Vopson, University of Portsmouth) who thinks he has a way of proving that we are just characters in an advanced virtual world.
This is, by the way, a rehash of the exact same story that they published on the 10th of October. Repeating something in three and a half months? What, no Meghan stories?
Let's see... Limits to how fast light travels correspond to the speed of the computer processor (running the simulation), the laws of physics are akin to "computer code", symmetry saves on computational power, and the elementary particles are akin to pixels. Plus some gumph about information potentially having mass (and being the fifth form of matter in the universe).
Oh, and that equations, numbers, and geometric patterns are visible all over. Nature is full of them.
This article has been repeated in news sources all over the place, not just the Mail, and, yes, many of them claim that he can "prove" it.
This is, essentially, a massive pile of pseudo-scientific bullshit.
The first question I would ask is: What is the point of this simulation?
If it is to keep humans believing they're living normal lives while they're acting as biological batteries (like in The Matrix), then comes the second question: Why the suffering?
Ukraine and Russia are at war, many are injured and dying.
Rinse and repeat for Palestine and Israel.
In many places of the world, the female of our species gets a pretty raw deal, either being the possession of a man or simply being murdered as a baby because her parents' sky fairy tells them that girls are bad.
Speaking of women's rights, they're slipping into reverse in the US, and a second round of Trump might well tip the country into anarchy.
Why? Why all of this, rather than a nice Bob Ross world with happy trees?
If our consciousness can be fooled into thinking it's real and alive when it is a simulation, why not just pair up two people, make them about ten years old, and give them an island upon which to camp and have "an adventure". Every 'night' wipe their memories and they can wake up and repeat the day... forever. Oh, and find some "excuse" for why they can't leave the island (leaky boat, can't swim, etc). In that way, everybody can have their own personal island with a friend for company, and nothing needs to be thought about or rendered beyond the horizon (and between the island and the horizon is just water).
We can go further and say that if this is a simulation and things are limited (for example, symmetry in things because it's faster and takes less information), then comes the third question: Why is the observable universe so complicated?
If you collect snowflakes as they fall and look at them, you'll see that they're all different. It's one of the marvels of the natural processes that throw the water around in the cloud until it falls as snow. But, you know, we only discovered this fairly recently (in human history), when somebody had a microscope and looked at them.
If they were all identical, and that's what we observed, we'd probably remark that it was weird that they're all the same, just as we currently remark that it's weird that they're all so different... but had they been discovered to be all the same, we'd simply have accepted that as how things are.
So why such a ridiculously insignificant detail? If you're trying to conserve processing power, there's no justification for snowflakes not to be copy-pasted from a "snowflake" template.
In fact, why do we even have snowflakes? One could just as easily optimise things so that when it is cold enough the raindrops simply fall as ice. So we would have "hail days" instead of "snow days", and places would be clogged with hailstones rather than snow drifts.
Or, you know, just botch the temperatures a little so that rain never falls as ice at all, only ever as rain. Then you can get rid of the distinctions like snow, sleet, and hail and simply have a few different sizes of raindrop.
But we can go further and have all of the raindrops be the same size; it's just the number of them falling at once that changes.
So we've taken our complex water-falls-from-clouds behaviour and reduced it to varying amounts of exactly the same thing. While it might seem strange to us, it wouldn't be at all strange if this is how it had always been. After all, we are used to the sun being yellow (it isn't), the sky being blue, and grass being green. Because that's how it is in our reality. But, honestly, the sun could just as easily have been reddish, the sky green, and the grass a sort of purple.
Actually, the colours are down to physics and chemistry: the atmosphere scatters the blue out of sunlight, which is why the sky is blue and the otherwise white sun looks yellow, and chlorophyll absorbs the red and blue from sunlight while reflecting the green, which is why plants look green. So, being oxygen breathing creatures on a planet with an atmosphere, the colours of these things were determined by physical and chemical properties.
That being said, if this was a simulation, oxygen could have been associated with Hello Kitty Pink and we, knowing no better, would be like "oh, okay". Mindscrew: Oxygen is Hello Kitty Pink, it's just this simulation is defective and has it as blue. ☺
The speed of light thing is amusing, as this implies that the computational processes that run this so-called simulation resemble those of current computer technology.
Our brains are like computers. But not computers that we would recognise or have been able to build. The brain is a massively parallel processing unit with parts that run at between 10 and 100 hertz, which is probably on the order of ten billion times slower than the processor that is showing you these words. It has the ability to program itself, c'est possible d'apprendre des choses (it's possible to learn things), and it can make inferences that are phenomenally complex, and that's just "normal".
Think about it. Imagine yourself walking out front, picking up a basketball, and tossing it through a hoop.
You might even have a hoop and a ball to throw through it. While you may suck and miss (or is that just me?), the fact is that all of you could imagine this because at some stage you have walked, you have navigated your house to the front door, you have identified a ball by sight alone, picked it up, and thrown it.
Let's start with something simple. You're standing below the hoop. It's about four metres away. You're holding the ball. Throw it.
Now describe the equations that you did in your head to do that. How did you know what angle to throw upwards at? How about the side to side angle (or, if you rotated your body, how did you know the correct orientation)? How much force did you put behind the ball? How did you know? Or if you were just guessing, how did you know how much to guess?
The fact is that it would probably take you the rest of your lifetime to take the act of walking outside and throwing a ball and break it down into pure mathematical equations. But for our brains, it's a fairly easy autonomous action. Sure, there are those who are way better at it, which is why they are paid well to do it as a competitive sport; while the rest of us may be rather less impressive, we'd at least generally get the ball sort of near the hoop. Very few of us would immediately fall over, crawl outside, and then toss a plant pot through the living room window (unless we're very drunk).
I'm not talking about brains to say "ooh, clever innit?" (yes, it is); I'm talking about them to provide an example of a computational object that we all come equipped with, in order to highlight that whatever advanced person or civilisation is running this simulation may well have "computers" so alien to us that our brains would have more in common with a home PC than with whatever is running the simulation. As such, saying that the speed of light is a limitation of the processor power is nonsense even by the standards of this nonsense. Übernonsense, if you like.
Next, let's lay waste to the idea of symmetry. There are many designs in nature that look symmetrical, but aren't. Butterflies and faces, for example.
Go take a picture of your significant other. Have them looking straight at the camera.
Now, in a photo editor, crop their face so you only have the left half (your left, their left, it doesn't matter).
Copy that half, then flip it horizontally.
Finally, move the flipped half into the correct place to recreate the entire face.
What do you see? Uncanny valley, right? Because while faces look symmetrical, they aren't exactly.
It's the same with a lot of things in nature. Go look at a budgie or cockatiel and you'll notice that while the colour patterns look the same on each side (far easier to see on a male cockatiel), they are not identical. Or, you know, the fur colours of a cat, especially if it's one like Anna that is "white with blotches".
I have wasted enough time on this crap. I'm sure you get the idea that I think us being in a simulation is ridiculous.
Now let me offer some proof of how we are living in a simulation...
First up, no matter what sort of technology is actually running the simulation, calculations are expensive. The more calculations, the more time. Even in a vastly parallel system that can do a billion things at once, it's an immutable fact that having to calculate a hundred things takes more work than calculating just one.
This is why the earth is round. By making the earth round, the horizon is only about five kilometres (three miles) away. Sure, there are ways to cheat this (a tall building or other elevated location), but there is still a finite (and not terribly far) limit to that which can be directly observed.
You'll notice that if you try to observe a lot further by, say, climbing a mountain, the landscape around you will be mostly barren mountain. This is intentional, and is designed to reduce the complexity of the simulation around you.
Likewise, if you go up in an aircraft, the higher you go the less detail is visible. If you're looking down from a commercial jet, you are unlikely to be able to see the leaves of trees, and certainly not the blades of grass. They'll have been replaced with some randomly textured green patches in order to reduce complexity.
This curvature of the earth also has another benefit. The light source is from a point away in space, which means that half of the planet experiences night while the other half experiences day.
Or, to put it into different words, half of the simulation is a bright and vividly coloured scene with many details, while the other half is a dark, barely-detailed scene (and most of you aren't supposed to see it, you're supposed to be asleep).
In order to simplify the simulation, it only fully runs half at a time, and just keeps changing exactly which half that is.
Your comments:
Gavin Wraith, 2nd January 2024, 13:08
Happy New Year. Good critique. People should realize that theoretical physics, and the applications of maths to the real world, is all about forgetting details. We cannot function without oversimplifying. Two apples plus two more apples makes four apples. We do the same exercise with pears, a light bulb comes on and "gosh, 2 + 2 = 4". I bring to your attention a word, decategorification, which has trended in maths during the last three decades. Quite broadly, decategorification is a rigorously defined procedure for forgetting information and reducing the complexity of a given mathematical structure. But often the forgotten information is useful for comprehension. Think of a square array of boxes. You have 1 box at the top right. Then 3 boxes, forming an L shape, next to it. Then 5 boxes forming a bigger L shape next to it. And so on down, until the whole array has been shelled into L shapes. We conclude that the sum of the first N odd numbers is N squared. Odd numbers are naturally L-shaped with equal arms, because you have a special box where the arms meet and an even number in the two arms. The statement about summing odd numbers does not need the idea of arranging boxes into shapes, but vision is our richest sense, so geometrical ideas can help us understand.
David Pilling, 2nd January 2024, 13:09
I'm not so sure... Person A has a 400 quid generic laptop full of free software, Person B a top of the range MacBook. Why not judge them on it - it says something. Even at a crude level, if you can afford Apple you're more likely to be well off.
What are you looking for: someone with money, or someone with good personality traits, imagination, generosity? Either way Apple might be the win.
Ah, more money than sense, or deep in debt, air-head follower of fashion... OK, it's not a perfect methodology.
Look at the laptops on the TV, always Apple, never see one sporting my brand (MSI) on the lid.
At some point I got a Nikon camera, and an Apple computer - great disappointment to find my life did not change. Budget models, can't fake it.
If I get a BMW, will I drive more aggressively... or is it aggressive drivers who buy BMWs?
jgh, 3rd January 2024, 06:31
On my parish council I keep having arguments with other councillors: "We need iPads" "No you don't, we need some generic, off-the-shelf, *CHEAP* portable tablet computer, not some £2000 shiney, look, here's some proposals at £80 per device...." "We need iPads" "SHUT UP!"
jgh, 3rd January 2024, 06:41
I'm sure there's a Saturday Morning Breakfast Cereal cartoon with something like: Human: God, it's so wonderful that snowflakes are all different. God: Shit! You've invented microscopes? Hell, I left out the snowflake rendering because I thought nobody would ever look close enough.
Rick, 5th August 2024, 16:37
"Look at the laptops on the TV, always Apple" - it's called Product Placement. Apple pays good money to have their logo visible. You may note, however, that it comes with politics and caveats. Bad guys do NOT use Apple. This was pointed out two decades ago by Wired regarding the series "24". The good guys used the Fruit, the bad guys used generic PCs. If you were paying close attention, you'd have known who the real traitor was because they were the one not using a Mac.