Since dunning-kruger isn’t valid, we’re halfway there!
30? Man I was still burning the midnight oil.
Didn’t feel any kind of slowdown till many years later.
It’s possible a smart TV will use its wifi to connect to another device of the same brand on its own. I’d read an article about it a couple years ago.
If I’m reading about it, that means a company has been working on it, and frankly it makes sense. If I were in their shoes I’d look into making it happen. It’s pretty trivial to do when you think about it.
Not that I think it’s happening in the wild, just an idea to keep in mind.
Also, those devices are always capturing data. So if/when they ever connect, that data will get uploaded.
They host software for anyone to use, and capture all the data, usage patterns, etc. for themselves, to use for their own benefit and against you.
Hell, Google deleted a company’s entire dataset recently. Everything. And you want to sit here and tell me they’re the answer?
Are you just an apologist for FAANG, etc? Because you’re really sounding like one at this point.
Who’s paying you to post this disinformation?
Trust.
I trust my brother more than Google. Same with Jim down the street.
I trust my circle of acquaintances more than Google (et al), especially since Google (et al) have repeatedly demonstrated that they're untrustworthy.
In fact, they’ve demonstrated they are outright adversarial to me and mine.
Also, Trust, but Verify.
Even if we had fantastic regulations, we’d still have scammers and hackers out there.
If my data never leaves my systems, my risk of exposure is far smaller.
The problem is dealing with the application side.
Just one feature that's massive: how many systems have automatic import/export using Excel file formats? Converting those processes will be a huge undertaking in itself, let alone how many other things will require re-engineering. The scope and scale of this is staggering.
A better effort would be to convert a single, small organization in government, then the scope is limited, but you get to build the fundamentals, and gain the experience of interfacing with extant systems.
People might judge you for using the word “master” but it should be alright in private.
I snorted. It was my inner 12-year-old's fault. (Also because some idiots have recently gotten up in arms about these terms in technology.)
Yea, Santa is the more secular icon. There are even Christians who take issue with the Santa character (which is a combination of “Pagan” and Christian mythos). I put Pagan in quotes, as that’s a Christian term for certain non-Christian beliefs.
Wait, did I read that correctly and they replaced the hat with a snowflake?? Hahahaha
And yes, people often forget that offense is generally taken, not given. When it’s intentionally given, it’s pretty clear.
Hahahahahahaha
My standard response to “just go Linux”:
I keep having to say this, as much as I like Linux for certain things, as a desktop it’s still no competition to Windows, even with this awful shit going on.
As some background - I wrote my first Fortran program on a Sperry Rand Univac (punched cards) in about 1985. Cobol was immediately after Fortran (wish I’d stuck with Cobol).
I had my first UNIX class in about 1990.
I run a Mint laptop (for the hell of it, and I do mean hell). Power management is a joke. Configured as best as possible, I walked in the other day and it was dead: battery at zero, won’t even POST.
Windows would never do this; no, Windows can never do this. It is incapable of running a battery to zero, it’ll shut off before then to protect the battery. To really kill it you have to boot to BIOS and let it sit. Windows will not let a battery get to zero.
There’s no way via the Mint GUI to configure power management for things like low/critical battery conditions/actions. None, nada, zip, not at all. Command line only, in the twenty-first century, for something Windows has had since, I don’t recall, 95 I think (I was carrying a laptop then, and I believe it had hibernate; sorry, it’s been what, almost thirty years now).
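For what it’s worth, on most modern distros (Mint included) the low/critical battery thresholds and actions live in UPower’s config file rather than any GUI. A minimal sketch, assuming your install uses UPower; the keys below are from UPower’s stock config, so double-check the file on your system:

```ini
# /etc/UPower/UPower.conf  (apply with: sudo systemctl restart upower)
[UPower]
# Use battery percentage (rather than estimated time) for policy decisions
UsePercentageForPolicy=true
PercentageLow=15.0
PercentageCritical=5.0
PercentageAction=2.0
# What to do when the battery hits PercentageAction:
# PowerOff, Hibernate, or HybridSleep
CriticalPowerAction=Hibernate
```

Still command line only, which is exactly the point, but at least the laptop hibernates instead of flatlining.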
There are many reasons why Linux doesn’t compete with Windows on the desktop - this is just one glaring one.
Now let’s look at Office. Open an Excel spreadsheet with tables in any app other than Excel. Tables are just a given in Excel: ten seconds to set up, and you get automatic sorting and filtering with near-zero effort. The OpenOffice devs refuse to support tables, saying “you should manage data in a proper database app”. While I don’t disagree with the sentiment, no, I’m not setting up a DB in an open-source competitor to Access. That’s just too much effort for simple sorting and filtering tasks, and isn’t realistically shareable with other people. I do this several times a day in Excel.
Now there’s that print monitor that’s on by default and can only be shut up via the command line. Wtf? Again, in the 21st century?
Networking… Yea, Samba works, but how do you clear creds you used one time to connect to a share, even though you didn’t say “save creds”? Oh yea, command line again, or go download an app to clear them for you. In the 21st century?
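In case it helps the next person: when you connect through the file manager (gvfs), those one-time creds usually land in the GNOME Keyring, and libsecret’s `secret-tool` can list and clear them. A sketch only; the attribute names (`protocol`, `server`) are what gvfs smb mounts typically store, so run the search first to see what your keyring actually holds, and `fileserver` is a placeholder for your share’s hostname:

```shell
# See what smb credentials the keyring is holding
secret-tool search --all protocol smb

# Clear the stored credentials for one specific server
secret-tool clear protocol smb server fileserver
```

Which, again: command line, in the 21st century.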
Oh, you have a wireless Logitech mouse? Linux won’t even recognize it. You have to search for a solution and go find a third-party download that makes it work. My brand new wireless mouse works on any version of Windows since Win2k (at the least) and would probably work on Win95.
Someone else said it better than me:
Every time I’ve installed Linux as my main OS (many, many times since I was younger), it gets to an eventual point where every single thing I want to do requires googling around to figure out problems. While it’s gotten much better, I always ended up reinstalling Windows or using my work Mac. Like, one day I turn it on and the monitor doesn’t look right. So I install twenty things, run some arbitrary collection of commands, and it works… only it doesn’t save my preferences.
So then I need to dig into .bashrc or .bash_profile (is bashrc even running? Hey let me investigate that first for 45 minutes) and get the command to run automatically… but that doesn’t work, so now I can’t boot… so I have to research (on my phone now, since the machine deathscreens me once the OS tries to load) how to fix that… then I am writing config lines for my specific monitor so it can access the native resolution… wait, does the config delimit by spaces, or by tabs?? anyway, it’s been four hours, it’s 3:00am and I’m like Bryan Cranston in that clip from Malcolm in the Middle where he has a car engine up in the air all because he tried to change a lightbulb.
And then I get a new monitor, and it happens all damn over again. Oh shit, I got a new mouse too, and the drivers aren’t supported - great! I finally made it to Friday night and now that I have 12 minutes away from my insane 16 month old, I can’t wait to search for some drivers so I can get the cursor acceleration disabled. Or enabled. Or configured? What was I even trying to do again? What led me to this?
I just can’t do it anymore. People who understand it more than I will downvote and call me an idiot, but you can all kiss my ass because I refuse to do the computing equivalent of building a radio out of coconuts on a deserted island of ancient Linux forum posts because I want to have Spotify open on startup EVERY time and not just one time. I have tried to get into Linux as a main dev environment since 1997 and I’ve loved/liked/loathed it, in that order, every single time.
I respect the shit out of the many people who are far, far smarter than me who a) built this stuff, and b) spend their free time making Windows/Mac stuff work on a Linux environment, but the part of me who liked to experiment with Linux has been shot and killed and left to rot in a ditch along the interstate.
Now I love Linux for my services: Proxmox, UnRAID, TrueNAS, containers for Syncthing, PiHole, Owncloud/NextCloud, CasaOS/Yuno, etc. I even run a few Windows VMs on Linux (Proxmox), because that’s better than running Linux VMs on a Windows server.
Linux is brilliant for this stuff. Just not brilliant for a desktop, let alone in a business environment.
Linux doesn’t even use a common shell (which is a good thing in its own way), and that’s a massive barrier for users.
If it were 40 years ago, maybe Linux would’ve had a chance to beat MS, even then it would’ve required settling on a single GUI (which is arguably half of why Windows became a standard, the other half being a common API), a common build (so the same tools/utilities are always available), and a commitment to put usability for the inexperienced user first.
These are what MS did in the 1980’s to make Windows attractive to the 3 groups who contend with desktops: developers, business management, end users.
All this without considering the systems management requirements of even an SMB with perhaps a dozen users (let alone an enterprise with tens of thousands).
Really?
Because nothing I use works in Linux or at least doesn’t easily.
My 10-year-old Logitech mouse doesn’t work, at all, until I Google how to make it work.
Then there’s OneNote, which syncs directly with every machine, no server required.
Or Excel - got Tables in LibreOffice yet? You know, what 97% of people use Excel for?
I could go on for days. At every turn, Linux is inferior to Windows as a desktop.
And I use Linux every day as a server: TrueNAS, Proxmox, Freedombox, RPi, etc. It’s brilliant for purpose-built systems.
My experience is the opposite.
Took an hour just to get a mouse to work on Mint
Good points.
Also, an SSD isn’t necessarily more power-efficient than spinning disks. It depends on the specific drives and the use-case.
I’ve seen a table posted on Lemmy with data on different drives’ power consumption at idle, read, and write. Sometimes the SSD consumed more power.
I’ve bought two laser printers, both for about $50.
My 1997 laser just died this summer. That’s 27 years of runtime. Even if it was $500, that’s $18/year, with thousands of pages printed, and I think I replaced the toner once.
Glad you have a printer nearby. I do too, but it would take me an hour to print one page, because I’d have to copy it to a thumb drive, then go to the print shop (15 min, using fossil fuels to get there), then deal with printing and hope it prints right, then shuffle back home.
I mean, yea, that’s a fabulous approach. Do that 50 times and I’ve paid for my printer.
Set up a Raspberry Pi Zero W as your print server, using CUPS.
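Rough sketch of the CUPS setup on a fresh Raspberry Pi OS install; the package name and `cupsctl` flags are the stock Debian/CUPS ones, but verify against your release:

```shell
# Install CUPS and let your user administer printers
sudo apt update && sudo apt install -y cups
sudo usermod -aG lpadmin "$USER"

# Allow admin access and printing from other machines on the LAN,
# and advertise shared printers
sudo cupsctl --remote-admin --remote-any --share-printers
sudo systemctl restart cups
```

After that, the CUPS web UI on port 631 of the Pi lets you add the USB printer, and other machines on the network discover it automatically.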
Last one I bought was under $10.
Only a cheap laser looks like shit for images.
Also, using the right paper is crucial.
I dunno, I’ve been printing through a print server for decades. I find it far less problematic than wifi-connected printers (especially consumer ones).
Consumer wifi printers are a kludgey mess.
I forget the details, but each design has a use-case.
Though for most people, the newer design is likely the better choice.