For the last ten years I was really enjoying what Microsoft and its .NET teams were doing. It felt like a good community to be a part of. Huge strides to make things run anywhere and be more involved with the open source community.
While that hasn’t necessarily gone away, jamming LLMs into everything is leaving a real sour taste. Pointless Copilot buttons anywhere and everywhere. VS and VS Code pushing the GitHub Copilot chats and agents.
We are quickly back to the corporate MSFT that doesn’t listen to its users or employees. All that goodwill has been washed away, and now I feel the need to switch off of Windows.
I just switched to using NeoVim so I don’t have to see all the AI shit
We are quickly back to the corporate MSFT that doesn’t listen to its users or employees. All that goodwill has been washed away, and now I feel the need to switch off of Windows.
Please cite one example of Microsoft ever giving a fuck about users. Since their inception, Microsoft has always been about gaining as much leverage as possible in every business relationship, and then exploiting that leverage to the fullest. “Embrace, extend, extinguish” is a well-documented example of that philosophy.
Anybody who still trusts Microsoft is dumb, or too young to have been really screwed over yet.
Seriously, STOP TRUSTING MICROSOFT. No matter how much the tech industry evolves, that rule is ironclad. It has even outlived Moore’s Law.
Please cite one example of Microsoft ever giving a fuck about users.
There aren’t many examples, but one that comes to mind is the Xbox Adaptive Controller. It’s not cheap, but it’s also presumably low volume, and it’s unbelievably configurable.
Outside of that, I’m out of ideas. Usually every good change comes in response to user backlash, from my experience anyway. I’ve moved over to Linux by now because I’m tired of dealing with what Windows has become.
Or are in an abusive relationship with Microsoft
Huge strides to make things run anywhere and be more involved with the open source community.
Are you familiar with Embrace, extend, and extinguish?
For a while there, there was a perception that Satya was trying to move away from that.
Then they started firing people, cutting the sales budget, running licensing audits, and churning out trash, and it’s 100% back to business as usual.
I remember the “this is my office” stickers the MS people had when they were pushing the EMS. Now they’re required to be on site. Hmmm
Yeah, I agree, there was a time when Satya had a fresh outlook and instilled hope.
Now he cannot take criticism about their AI offerings and they are forcing it into everything.
Boggles my mind that Notepad has a Copilot button. Truly tone deaf.
Satya was put in place to turn Microsoft into the next IBM. Anything that isn’t service- or subscription-based revenue will be killed off. Innovation no longer matters.
I’m a hobby programmer and have thought of learning dotnet 10
It is a great ecosystem with great tooling and language features.
The other day I was thinking there should be a fork of dotnet. The two things it would do differently: telemetry being totally removed, and an alternative to nuget.org with the requirement that packages be published under free software licenses. Setting such a thing up could be insurance in case they pull anything in the future, too.
there should be a fork of dotnet.
Dotnet is maintained by the .NET Foundation and is entirely open source. There are thousands of forks and local clones of the repos under that organization. Rather than hoping someone does this, it’d actually be a huge benefit to everyone for you to create a local clone of the repo and update it now and then, assuming you’re worried it might go down anyway.
telemetry being totally removed
DOTNET_CLI_TELEMETRY_OPTOUT=1, though it’s lame that it’s opt-out and not opt-in. The CLI does at least give a fat warning on first use (which hilariously spams CI output). Opt-in would be so much better.

an alternative to nuget.org
You can specify other package sources as well, so nothing technically stops someone from making their own alternative. That being said, you’d have to configure it for each project/solution that wants to use that registry.
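For what it’s worth, pointing a project at a different feed is just a nuget.config file next to the solution. A minimal sketch — the feed name and URL here are made up for illustration, not a real registry:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- drop nuget.org entirely and use a hypothetical alternative feed -->
    <clear />
    <add key="alt-feed" value="https://nuget.example.org/v3/index.json" />
  </packageSources>
</configuration>
```

The `<clear />` is what actually removes nuget.org as a source; without it you’d just be adding a second feed alongside the default.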
Setting such a thing up could be insurance in case they pull anything in the future, too.
The main thing I’d be worried about here is nuget.org getting pulled. As far as I can tell, it’s run by MS, not the foundation. That’d be basically the entire ecosystem gone all at once. Fortunately, it’s actually super easy to create private registries that mirror packages on nuget.org, and it’s standard practice to do this at many companies. This means that at the very least it would be possible to recover some of the registry if this happened.
For a fork, I would think these would be the main goals I’d look for:
- Default to opt-in for telemetry, or make it local-only. Telemetry should go to a sink owned by the forking organization if sending telemetry is even possible at all.
- Default package registry should be one owned and maintained by the forking organization. This would be incredibly expensive though, so they’d need funding for this.
- Organization should be independent and not funded by MS at all. Alternatively, MS can provide funds, but not a majority amount; sponsors would be limited so that no single sponsor can fund enough of the fork to control it unilaterally. In either case, the goal is that no single company has enough influence to meaningfully shift the direction on its own.
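For comparison, today’s opt-out mentioned above is just an environment variable — a quick shell sketch:

```shell
# Opt out of .NET CLI telemetry for this shell session
export DOTNET_CLI_TELEMETRY_OPTOUT=1

# Any `dotnet` command run in this session now skips telemetry;
# put the export in your shell profile to make it permanent
echo "$DOTNET_CLI_TELEMETRY_OPTOUT"   # prints 1
```

A fork defaulting to opt-in would just flip the meaning: telemetry stays off unless you explicitly enable it.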
You can be on Linux and still enjoy most of .NET. Hell, I’d say it’s high time for most to switch off of Windows.
I’ll admit, I really enjoyed C# for game dev, but post-Unity shenanigans that died off. And now I just do boring, run-of-the-mill backend stuff…
Though as a programmer, I’ve been tempted to learn classic C, and even D.
Outside of corporate work, I haven’t found much fun (besides GDScript, ’cause game dev), but languages used by FOSS do get my attention from time to time.
If you are looking to learn a lower-level language than C# and aren’t interested in Rust, I really recommend Zig. It feels a lot like C but with modern conveniences and with the footguns removed. It is still in development, though, so breaking changes happen in the stdlib between versions.
Yes but despite the footguns, C (not C++) is a relatively small language, not too hard to learn. And it’s the glue between kernel, system libraries, and all other languages. You don’t want to write big applications in it any more, but it’s still useful to know when you interface with existing stuff.
Meanwhile, everyone I work with is loving the smooth Copilot integration with VS Code.
It’s so good at automating boilerplate stuff.
Especially testing, oh god does it make writing tests faster. I just tell it the scenarios that have to be tested and boom, 1000 lines of boilerplate produced in like 5 minutes.
And when it has existing code to use as a reference on how to do it right, it does a very solid job, especially with repetitive stuff like tests, since usually 95% of the code in a test is just boilerplate arranging and setting up the scenario.
Also “hey go add xml docs to all the new public functions and types we made” and it just goes and does it, love that lol
Once you acknowledge that like 90% of your code is boilerplate, and Sonnet/Opus are extremely capable at handling that stuff when they have existing references to go off of, you can just focus on the remaining 10% of “real” work.
Why is everyone okay with boilerplate? Did we forget what programming languages are supposed to do?
You still have to maintain that code.
There’s a fundamental minimum amount of boilerplate you just have to write to make a functioning app, even if it’s simply describing “this thing does this.”
For example, if I’m making a web API, there’s just fundamentally a chunk of boilerplate that wires up “this HTTP endpoint points to this domain logic over here.”
And then there’s gonna be some form of preamble describing “it takes in this input, it returns this response, and here’s all its validation.”
And while it’s simple code, and it’s very simple to test, it’s still a buncha LOC that any half-assed dev can write.
Stuff like that, AI can shit out very quickly given an input requirements doc that you, the dev, were gonna get anyways.
And then you, the dev, can fill in the actual logic that matters after all that basic boilerplate stuff.
“Yes, it has a phone number input, it’s required, and it must fit the phone number regex we defined. So… shocker, you gotta put a string called PhoneNumber on the input model, and another shocker, it’s gotta have the phone number validation and required non-empty string validation on it.”
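That endpoint-plus-validation boilerplate looks roughly like this as an ASP.NET Core minimal API — a sketch where the route, model name, and regex are all made-up examples, not from any real project:

```csharp
using System.Text.RegularExpressions;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Boilerplate: wire the HTTP endpoint to the domain logic
app.MapPost("/contacts", (CreateContactRequest input) =>
{
    // Boilerplate: required non-empty string that fits the phone regex
    if (string.IsNullOrWhiteSpace(input.PhoneNumber) ||
        !Regex.IsMatch(input.PhoneNumber, @"^\+?[0-9]{7,15}$"))
    {
        return Results.BadRequest("PhoneNumber is required and must be a valid phone number.");
    }

    // ...the actual domain logic you care about goes here...
    return Results.Ok();
});

app.Run();

// Boilerplate: the input model itself (hypothetical name)
record CreateContactRequest(string PhoneNumber);
```

None of that is interesting code, which is exactly the point: given the requirements doc, any dev (or LLM) produces roughly the same thing.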
It doesn’t take much trust in the LLM to get that sorta stuff right, but it saves me a whole bunch of time.
That 10% is ideally “creating value” for the customer. Boilerplate code is not value, therefore outsource it to LLMs.
Pretty much, it’s the actually important code you wanna pay attention to.
The majority of code is just connecting pipe A up to pipe B; it’s honestly fine for an LLM to handle.
The job security comes from, as a developer, knowing which code goes in the 90% bin vs. which goes in the 10% bin; being able to tell the difference is part of the job now.
Yeah, I am not saying the tools are bad. I also use Claude, mostly via our internal corporate tooling, for initial generation of things like unit tests, and it helps sketch and brainstorm.
I will say it is crazy the amount of trust people/companies are putting in the tools though. They can and will make up straight lies out of thin air. At least with code, things don’t compile, which helps a bit. Even then, I have been watching a few MSFT repos and they have devs just blanket-approving Copilot-generated PRs that have bugs and breaking changes.
The tools are great, but they aren’t an excuse to be lazy. And they aren’t a replacement for that last 10% like you said, still need devs and real human problem solving.
Yeah, and really it moves a lot of your work over to other important stuff.
Namely, planning things better, reading, documenting, and coming up with more specific scenarios to test.
Before, because I’d spend an extra chunk of my time on that 90%, maybe my documenting would be mid at best, stuff would slip through, and my pile of “I prolly should get around to documenting that stuff” kept growing and growing.
And then, while maybe I can vaguely think “yeah, I bet there’s edge cases for this stuff I didn’t make tests for,” it’s followed by “but I don’t got time for that shit, I have to have this done by end of day.”
Meanwhile, with LLMs, I can set one off to cook on that 90% chunk of work, and while it’s cooking I can chat with another LLM instance and iterate back and forth on “what are some possible gotchas in this logic, what are edge case scenarios to test?” By the time the agent finishes coding, I have like 20 edge case tests to copy-paste over to it: “Hey, make tests for all these cases, make sure those all work as expected <big copy paste of scenarios and expected outcomes>.”
It shifts my focus from monkey work to stuff that matters more: finding and poking holes in the code, trying to break it, making sure it withstands stress and edge cases, and finding possible gaps and flaws in it.
When you focus like that, you definitely become way more productive.
As opposed to people who just give up and, yeah, as you said, are just lazy. They hand off the work to the LLM but aren’t making up for that by redirecting the energy to other places of value; they’re gonna go, I dunno, run a raid in WoW or something, fuck knows.