Just figured out there are 10 places called Lisbon dotted around the US, according to the search.
vluz
- 1 Post
- 24 Comments
vluz@kbin.social to Technology@lemmy.ml • The Majestic Birth of Graphical User Interfaces – Xerox Alto and the Alto Trek game · 4 · 1 year ago
Me with four open CLI terminals right now:
https://i.kym-cdn.com/photos/images/original/001/617/650/91a.jpg
vluz@kbin.social to Open Source@lemmy.ml • On the search for the ̶b̶e̶s̶t̶ decent presentation making software · 3 · 1 year ago
Got one more for you: https://gossip.ink/
I use it via a docker/podman container I’ve made for it: https://hub.docker.com/repository/docker/vluz/node-umi-gossip-run/general
vluz@kbin.social to Selfhosted@lemmy.world • Linode Alternative Suggestions for Small Projects · 41 · 2 years ago
I got cancelled too and chose Hetzner instead. Will not do business with a company that can’t get their filters working decently.
vluz@kbin.social to Linux@lemmy.ml • Pyradion, internet radio TUI client, with recording functionality, written in Python · 7 · 2 years ago
Lovely! I’ll go read the code as soon as I have some coffee.
vluz@kbin.social to Stable Diffusion@lemmy.dbzer0.com • SSD-1B: 50% smaller and 60% faster Open-Source SDXL Distilled Model · 3 · 2 years ago
I do SDXL generation in 4GB, at an extreme cost in speed, by using a number of memory optimizations.
I’ve done this kind of stuff since SD 1.4, for the fun of it. I like to see how low I can push VRAM use. SDXL takes around 3 to 4 minutes per generation, including the refiner, but it works within those constraints.
The graphics cards used are hilariously bad for the task: a 1050 Ti with 4GB and a 1060 with 3GB of VRAM. I have an implementation running on the 3GB card, inside a podman container, with no RAM offloading, 1 vCPU and 4GB of RAM.
The graphical UI (Streamlit) runs on a laptop outside the server to save resources. I’m working on an example implementation of SDXL as we speak, and also on SDXL generation on mobile.
That is why I looked into this news; SSD-1B might be a good candidate for my dumb experiments.
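[Editor's note: for anyone curious how SDXL gets squeezed into that little VRAM, here is a minimal sketch of the usual memory knobs, assuming the Hugging Face diffusers library. The comment above does not say which tooling is actually used; the checkpoint ID, prompt, and step count below are placeholders, not the author's setup.]

# Illustrative low-VRAM SDXL settings with Hugging Face diffusers.
# Everything here is an assumption for illustration, not the exact setup described above.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",  # example checkpoint
    torch_dtype=torch.float16,                   # fp16 weights halve memory use
    variant="fp16",
    use_safetensors=True,
)
pipe.to("cuda")
pipe.enable_attention_slicing()  # compute attention in slices to lower peak VRAM
pipe.enable_vae_slicing()        # decode latents one slice at a time
pipe.enable_vae_tiling()         # tile the VAE pass for larger resolutions

image = pipe("a lighthouse at dusk", num_inference_steps=30).images[0]
image.save("out.png")

[These are only the standard first steps; getting down to the 3–4GB described above takes more aggressive tricks and a large speed penalty, which matches the 3-to-4-minutes-per-image figure.]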
vluz@kbin.social to Technology@lemmy.world • Microsoft Needs So Much Power to Train AI That It’s Considering Small Nuclear Reactors · 41 · 2 years ago
Oh my Gwyn, this comment section is just amazing.
Goddammit! Don’t tell that one, I use it to impress random people at parties.
vluz@kbin.social to Cozy Games@feddit.de • What’s a game you consider cozy that others might not? · 2 · 2 years ago
Not joking, although I understand it seems very silly at face value.
Dark Souls 3 PvP, specifically SL60+6 at gank town (after Pontiff).
It used to be my go-to wind down after a work day.
It made me smile and actually relaxed me enough to go to bed and sleep, especially after a hard day.
vluz@kbin.social to Technology@lemmy.world • X/Twitter has updated its Terms of Service to let it use Posts for AI training · 11 · 2 years ago
HateLLM will be a smash. /s
vluz@kbin.social to Linux@lemmy.ml • [A.I Art] [OC maybe?] I used ai to create multiple images of linux and pacman · 13 · 2 years ago
Well done!
vluz@kbin.social to Selfhosted@lemmy.world • How to set up Podman with NVIDIA GPU acceleration and macvlan networking on Gentoo · 2 · 2 years ago
That’s wonderful to know! Thank you again.
I’ll follow your instructions; this implementation is exactly what I was looking for.
vluz@kbin.social to Selfhosted@lemmy.world • How to set up Podman with NVIDIA GPU acceleration and macvlan networking on Gentoo · 2 · 2 years ago
Absolutely stellar write-up. Thank you!
I have a couple of questions.
Imagine I have a powerful consumer GPU to throw at this solution, a 4090ti for the sake of example.
- How many containers can share one physical card, taking into account that total VRAM will not be exceeded?
- What does one virtual GPU look like in the container? Can I run standard stuff like PyTorch, TensorFlow, and CUDA in general?
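[Editor's note on the second question, not an answer from the thread: with the NVIDIA container toolkit the card typically shows up inside the container as an ordinary CUDA device with the host card's full VRAM, and several containers can share the same physical GPU, with nothing enforcing a per-container VRAM quota by default. A minimal sketch to check this from inside a container, assuming PyTorch with CUDA support is installed:]

# Sanity-check what the GPU looks like from inside the container (assumes PyTorch + CUDA).
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(torch.cuda.current_device())
    print(f"Device: {props.name}")
    print(f"Total VRAM: {props.total_memory / 1024**3:.1f} GiB")
    print(f"CUDA capability: {props.major}.{props.minor}")
else:
    print("No CUDA device visible inside this container")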
Just
pip install mscandy -U
vluz@kbin.social to Technology@lemmy.world • The First Room-Temperature Ambient-Pressure Superconductor · 7 · 2 years ago
If at all true, this would be world-changing news.
I use this: https://cloudhiker.net/explore
It is not easy to go from healthy background levels of mercury to mild poisoning in at most 700-ish meals.
Each fish in those 700 meals would have to contain 100x the normal average mercury level, every single one, every single time, for every single meal, while eating up to a kilogram of fish in each meal.
It wasn’t fish; it’s more complex than that. I’m quite aware we’re discussing a real human, your friend.
If it was from eating fish and I’m completely wrong, I’m sorry. Wish the person a fast recovery as best as possible.
I won’t respond any further.
Erasmus is one semester, up to a maximum of one year.
It is impossible that the condition came from eating normal food here for that time, compared to a lifetime somewhere else. Take my grand-aunt, 102 years old, as an example; she would be a walking pot of mercury by now.
She ate fish all her life and, due to location, way more fish than meat. What about me at 50 years old, without even a hint of poisoning? I eat more fish than meat.
How does that work? Here are the numbers for heavy metal poisoning for 2022, ordered by rank and country:
https://epi.yale.edu/epi-results/2022/component/hmt
I’m very sorry for your friend and wish her the best without reservation, but her condition was not from eating fish during a semester in Portugal.
Messing around with the system Python/pip and newly installed versions until everything was broken, and then reading the documentation.
This was way back in the ’00s and I’m still ashamed of how fast and how completely I messed it up.