Banks’ systems are probably already compromised and they don’t even know it.
Seems reasonable and proportionate to me.
In Judge Dredd.
They have guns, just like police where I live have guns.
Locked in the car, not on their person.
If a situation requires a gun, they can go and get it.
Afterwards, they have to account for every round fired.
But then, it’s harder to kill “n****rs” extra-judicially that way.
That’s amazing. Who needs geothermal hot water when you have a DC?
Isn’t that a cello?
Poor data visualisation is a pet peeve of mine and I’m disproportionately vigorous when talking about it ;-)
Especially after a few drinks with dinner.
Pie charts tend to work when you have three or four categories; with more than that they fall apart.
The nice thing about a bar chart is that the axis label can be the raw value rather than a percentage, and a large number of categories (7-8) is still readable, especially if there are minority categories.
Also, it’s all just my opinion, so don’t let me stop you using whatever you like!
Use a bar chart. Pie charts are for marketing and pizza.
Also, no one cares if a bar chart sums to more than 100% ;-)
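For what it’s worth, here’s a minimal sketch of the kind of bar chart I mean (Python with matplotlib, entirely hypothetical data): raw counts on the axis rather than percentages, still readable with eight categories including a few tiny ones.

```python
# Minimal sketch, hypothetical data: a bar chart with raw counts on the axis
# rather than percentages, which stays readable even with 7-8 categories and
# a few small "minority" ones.
import matplotlib.pyplot as plt

categories = ["A", "B", "C", "D", "E", "F", "G", "H"]  # hypothetical labels
counts = [412, 305, 188, 120, 95, 33, 12, 7]           # raw values, not %

fig, ax = plt.subplots(figsize=(8, 4))
ax.bar(categories, counts)
ax.set_ylabel("Count (raw value)")  # axis carries the actual numbers
ax.set_title("Responses by category")
plt.tight_layout()
plt.show()
```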
Yet when a council says “you can’t build there, it’ll flood in a one-in-fifty-year event,” they get called obstructive and litigated against, and then a developer builds there anyway and sells at a premium.
When I bought my home, I actually considered floods, climate change, and the imminent West Antarctic ice sheet collapse with the resulting sea level rise.
We’ve known about climate change for a hundred years and are still acting all surprised that it is happening.
Hope it is incredibly boring.
The best kind of upgrade.
Thanks for all your effort!
Just roll with the LibreCalc charts.
It’s the data that’s important; perfectly rendered gradients on histogram bars are less important.
I haven’t wanted an Intel processor for years. Their “innovation” is driven by marketing rather than technical prowess.
The power envelope and microcode bullshit with the latest batch of 13900Ks, and again with the 14900Ks, was the final straw.
They were more interested in something they could brand as a competitor to Ryzen, then left everyone who bought one (and I bought three at work) holding the bag.
We’ve not made the same mistake again.
Intel dying and its corpse being consumed by its competitors is a fairy tale ending.
LoL.
The glockenspiel thread was worth it though.
This is I Have No Mouth and I Must Scream material.
It’s “world’s smallest violin” where I’m from. Are there regional/cultural variations on this saying?
Thinking about it, the SoC idea could stop at the southern boundary of the chipset in x86 systems.
Include the DDR memory controller, PCI controller, USB controllers, iGPUs, etc. Most of those have migrated into x86 CPUs now anyway (I remember having north and south bridge chipsets!).
Leave the rest of the system (NICs, dGPUs, etc.) on the relevant buses.
The films were cast about as perfectly as possible, and the elves were portrayed perfectly too. You definitely got the feeling that they were all immortal, ancient, and dripping with the arrogance and composure that would bring.
NVIDIA spent many many years doing a very very poor job of providing drivers for Linux.
Many people have not forgiven them for that.
I was there, Mr Anderson, 3000 years ago…
It’s not paranoia if they really are trying to ~~kill~~ scam you. IMHO you probably now have the right amount of scepticism.