I pay $100/month for internet access.
Lemmy may be free to access, but certainly not free to host. Am I paying for it personally? No, but someone is.
You also don’t see Lemmy paying hundreds of YouTubers and influencers for ad spots.
The very first time I saw an ad for Honey I knew there had to be a catch. Nothing is ever free.
It wasn’t immediately obvious how they were going to make money, though. I figured they’d just gather and sell user data. I had completely forgotten about affiliate links. But they probably also sell your data for good measure.
Fish is a great shell, but whenever I SSH into another machine I end up having to do everything in Bash anyway. So the fact that Fish is so different often ends up being a detriment, because it means I have to remember how to do things in two different shells. It was easier to just standardize on Bash.
I might try daily driving it again when this release hits the stable repos, I dunno.
So is this suggesting the cosmological constant isn’t actually constant, but depends on the configuration of matter?
A Linux distro with a great OOTB experience for gamers would go a long way.
Seems Overstreet is just pissy that he can’t talk to people on the kernel mailing list like it’s 2005 anymore. “Get the fuck out of here with this shit,” indeed.
So they wanted to sell Itanium for servers and keep x86 for personal computers.
That’s still complacency. They assumed consumers would never want to run workloads capable of using more than 4 GiB of address space.
Sure, they’d already implemented Physical Address Extension, but that just let the OS address more physical memory by widening the page table entries. It didn’t increase the virtual address space available to applications.
An application didn’t necessarily need to use 4 GiB of RAM to hit those limits, either. Dynamic libraries, memory-mapped files, thread stacks, and various paging tricks all eat up the available address space without needing to be resident in RAM.
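To make that concrete, here’s a minimal Linux-only sketch I’m adding purely for illustration (not from the original comment): anonymous PROT_NONE mappings consume virtual addresses without consuming physical memory, and a 32-bit build of this loop gives up after roughly 2–3 GiB even though almost no RAM has been touched.

```c
/* Illustrative sketch (mine, not from the comment above): reserve
 * address space without ever touching physical RAM.  Built as a 32-bit
 * binary, the loop fails after roughly 2-3 GiB of reservations. */
#include <stdio.h>
#include <sys/mman.h>

int main(void) {
    const size_t chunk = 64 * 1024 * 1024;  /* 64 MiB per reservation */
    size_t total = 0;

    /* PROT_NONE + MAP_ANONYMOUS consumes virtual addresses only;
     * no pages become resident until they're actually accessed. */
    while (mmap(NULL, chunk, PROT_NONE,
                MAP_PRIVATE | MAP_ANONYMOUS, -1, 0) != MAP_FAILED) {
        total += chunk;
    }

    printf("reserved %zu MiB of address space before running out\n",
           total / (1024 * 1024));
    return 0;
}
```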
Their last few generations of flagship GPUs have been pretty underwhelming but at least they existed. I’d been hoping for a while that they’d actually come up with something to give Nvidia’s xx80 Ti/xx90 a run for their money. I wasn’t really interested in switching teams just to be capped at the equivalent performance of a xx70 for $100-200 more.
This highlights really well the importance of competition. Lack of competition results in complacency and stagnation.
It’s also why I’m incredibly worried about AMD giving up on enthusiast graphics. I don’t have much hope for Intel Arc.
Problem is, AI companies think they could solve all the current problems with LLMs if they just had more data, so they buy or scrape it from everywhere they can.
That’s why you hear every day about yet more and more social media companies penning deals with OpenAI. That, and greed, is why Reddit started charging out the ass for API access and killed off third-party apps, because those same APIs could also be used to easily scrape data for LLMs. Why give that data away for free when you can charge a premium for it? Forcing more users onto the official, ad-monetized apps was just a bonus.
These models are nothing more than glorified autocomplete algorithms, parroting responses to questions that already existed in their training data.
They’re completely incapable of critical thought or even basic reasoning. They only seem smart because people tend to ask the same stupid questions over and over.
If they receive an input that doesn’t have a strong correlation to their training, they just output whatever bullshit comes close, whether it’s true or not. Which makes them truly dangerous.
And I highly doubt that’ll ever be fixed because the brainrotten corporate middle-manager types that insist on implementing this shit won’t ever want their “state of the art AI chatbot” to answer a customer’s question with “sorry, I don’t know.”
I can’t wait for this stupid AI craze to eat its own tail.
Most likely written down somewhere. The seed phrase is the backup method of storing a private key to a crypto wallet. You’re supposed to put it somewhere safe as a way to recover the wallet if the normal way to access it (a software app or a hardware device) fails.
Brute-forcing a full 12- or 24-word phrase would take centuries to millennia, so there are only a few possibilities:
This is the whole idea behind Turing-completeness, isn’t it? Any Turing-complete architecture can simulate any other.
Reminds me of https://xkcd.com/505/
We’ve seen plenty of evidence that the current inflation is almost entirely driven by companies price gouging consumers.
And actually, the fact that the price hasn’t increased is pretty obvious evidence of this.
Do you think, for one second, Apple would accept any appreciable hit to its profit margin if its costs had inflated 1:1 with consumer prices? Especially when it has the perfect excuse for a price increase?
The phone may cost them a little more to make than last year, but I doubt it’s that much.
There’s tons of elasticity built into the pricing already so that carriers can offer discounts.
The point is kind of moot because the phone definitely comes with the cable: https://www.apple.com/iphone-16/specs/
The article is actually about the new AirPods. I was going entirely off the information in the comment I was replying to.
The thing is, the iPhone 14, 15 and 16 all have the same launch price: $799 US
Adjusted for inflation, the 14 and 15 may have cost more, but Apple is almost certainly making that money back somewhere else. Like, say, making people pay for accessories that used to be included?
And at the end of the day, the prices consumers pay for end products don’t follow the exact same curve as the prices megacorporations pay for materials and labor. We’ve seen plenty of evidence that the current inflation is almost entirely driven by companies price gouging consumers. So it’s not really reasonable to assume that Apple’s costs have gone up 1:1 with consumer prices anyway.
But here’s the question: does it cost Apple $20 to make a cable? I seriously doubt it. It probably costs them closer to 20 cents per cable. So in reality, they now make roughly $20 more on every cable sold than they did before.
Sure, not everyone is buying a cable with every phone. But cables get lost, they wear out, they get stolen by your kids to charge their iPhones because they broke theirs, they get chewed up by pets, etc.
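Back-of-the-envelope, for what it’s worth: only the ~$20 price and ~20-cent cost guesses above come from this thread; the 50% attach rate is purely an assumption I’m making for illustration.

```c
/* Rough sketch of the extra margin per phone once the cable is unbundled.
 * Only the retail price and unit cost echo the guesses above; the attach
 * rate is an assumed number for illustration. */
#include <stdio.h>

int main(void) {
    const double cable_retail = 20.00;  /* what the cable sells for */
    const double cable_cost   = 0.20;   /* guessed manufacturing cost */
    const double attach_rate  = 0.50;   /* assumed share of buyers who add one */

    /* The $0.20 is saved on every phone (no cable in the box);
     * the ~$19.80 margin only lands when a cable is actually sold. */
    double extra_per_phone = cable_cost + attach_rate * (cable_retail - cable_cost);

    printf("extra margin per phone: $%.2f\n", extra_per_phone);
    return 0;
}
```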
And you can bet your ass that, just like any other high-margin item, the people in the Apple store are gonna be incentivized like hell to get every customer to buy a cable with their phone whether they really need it or not:
Do you have a charging cable?
Is it an Apple cable?
Are you sure you have one that’s USB-C and supports USB Power Delivery?
And it’s not worn out?
You say your dog chewed on it a little but it’s mostly intact and still works?
Well, I’d recommend getting a new one anyway.
Yeah you can get your own if you want but it’s best if you get an Apple cable.
OK great, that comes out to $820 total. And do you want to insure your phone for $5 a month?
It’s fine if they reduce the price accordingly.
If it’s still the same price after they take the cable out, it was never about reducing waste to begin with.
Knowing Apple, that wouldn’t surprise me in the slightest, which is why I never have and never will own any of their products.
I feel like you either fear and/or despise generative AI, or you think it’s the best thing since sliced bread.
There seems to be very little in-between.
Don’t even need an AI. Just teach a parrot to say “let’s circle back on this” and “how many story points is that?”
That’s my point? Nothing is ever truly free?