I’m very happy to see that the industry has moved away from the blockchain hype. AI/ML hype is at least useful, even if it’s a buzzword in most places.
So true.
With LLMs, I can think of a few realistic and valuable applications even if they don’t successfully deliver on the hype and don’t actually turn the world upside down. With blockchain, I just could never see anything in it. Anyone trying to sell me on its promises would use the exact words people use to sell a scam.
Blockchain is a great solution to an almost nonexistent problem. If you need a small, public, slow, append-only, hard-to-tamper-with database, then it is perfect. 99.9% of the time you want a database that is read-write, fast, and private.
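That append-only, hard-to-tamper-with property is really just a hash chain under the hood; a minimal sketch in Python (toy data, no consensus or mining):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's canonical JSON form.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append(chain: list, data) -> None:
    # Each new block commits to the hash of the previous one.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def verify(chain: list) -> bool:
    # Every block's stored "prev" must match its predecessor's actual hash.
    return all(
        chain[i]["prev"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
append(chain, "alice pays bob 5")
append(chain, "bob pays carol 2")
print(verify(chain))        # True
chain[0]["data"] = "alice pays mallory 500"
print(verify(chain))        # False: tampering breaks every later link
```

Editing any old block invalidates all the links after it, which is exactly the "hard to tamper with" part; everything else (mining, consensus, peer-to-peer gossip) exists to decide who gets to call `append`.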
My thoughts exactly.
I was told that, aside from flying scams under regulatory radars, the thing it’s great at is low-trust business transactions. But there are so many application-level ways to reasonably guarantee trust for any kind of transaction, for all kinds of business needs, using a plain private database. I guess it would be an amazing solution if those simpler alternatives didn’t exist!
I personally really like what Monero is doing with blockchain, but in most cases attempts at cryptocurrency (when not outright scams) fail in terms of privacy or performance. Bitcoin (the most popular one) has both of these problems: it is slow, limited to around 7 transactions per second, and it lacks any privacy, with a completely public transaction history. Monero has to do a whole lot of work to obfuscate its transaction history.
Currently basically all of these have another scalability problem: the blockchain itself keeps growing, with Monero’s stretching up to 150GB and counting.
Where most shitcoins have applied blockchain, I agree it’s all hype. But blockchain doesn’t solve a non-existent problem.
Trusting humans is an inherent security flaw. Blockchain solves that problem. You don’t have to trust banks to not shortsell the housing market with your own money (causing a recession for the entire world) if you could cut humans out of the equation.
Forget money. Say the data that you want to be able to transact and operate on is health data instead of financial information. You could create a decentralized identity system based on people’s biometric information. From there, you could automate and decentralize governance in general.
Suggesting Blockchain solves a non-existent problem is like suggesting Lemmy solves a non-existent problem
Unrelated to the overall point you’re trying to make, but shorts didn’t cause the '08 recession. They just profited from it. The cause was banks treating mortgage backed securities as if they were an unsinkable asset class.
Relating things back to your point though, I’m not convinced that blockchains solve this. Take the crypto crash of spring/summer '22: You have a few products (TerraUSD/Luna, CEL token) “generating” yield that everyone (DEFI, CEFI, retail, institutions) piles on top of. Then that base layer of “value” turns out to be a naked emperor and there’s a massive crash when everything based on that system is now backed by nothing. Rigid computerized rules are only as solid as the axioms that underpin them. You can decentralize the interpretation of rules, but somebody can always start with a flawed assumption and then it doesn’t matter how reliable your decentralized system is.
As long as any asset can be rehypothecated into another, shinier asset, there’s always a risk that the underlying asset is shit. It’s no less true in crypto as in conventional banking.
He did not claim that shorting caused the 08 crash, or am I missing something?
According to “The Big Short”, the reason was that banks gave loans to people who could not really afford them in case of an unexpected drop in the housing market (mortgage-backed, as you say), bundled the loans into packages, went to rating agencies who gave the packages top ratings, sold them to other institutions, and then shorted them when they noticed the market unexpectedly dropping, knowing people would not be able to pay back the loans in the packages. Which was completely reasonable, just somewhat unethical.
So, I think you could say it was an error of the rating agencies, as they underestimated the risk of a drop in the housing market when handing out the ratings.
You don’t have to trust banks to not shortsell the housing market with your own money (causing a recession for the entire world)
The way I read this, it suggests that banks shorting the housing market with my deposits caused a global recession.
You’re right about the ratings agencies (as far as I know, also from The Big Short), I was skipping over that for brevity.
At the very least it compounded it. But didn’t the banks that shorted it know the crash was going to happen? Why would an institution that large bet against the housing market, when the stigma was
Why would health data be something you want decentralized?
The only possible use case I can think of is someone who has unique info that an emergency room would need. At that point, a medical alert bracelet would be the way to communicate that. Otherwise, I want to know exactly who has my medical information. That’s super sensitive info.
Alright, it’s early so I’m not going to structure this much, but here’s my cypherpunk argument.
So, a decentralized ID system could be implemented by having a microchip implanted in the heart. The measured signals are more unique than your fingerprint, and if someone stole it, they’d have to kill you by ripping it out of your heart.
But no one can trust a single company or government to make such a chip and not abuse that very rich health data which you can infer emotional states with. So instead a standard is developed so other people can develop the device independently.
But decentralization goes beyond just the manufacturing of the device itself; it extends to governance of the data it collects. It doesn’t matter if your data is encrypted on the way to a single corporation’s servers, they still own the data.
Furthermore, fully homomorphic encryption could be used to perform operations on encrypted data without ever decrypting it (unless you decrypt it yourself with the keys from your microchip).
So decentralization and FHE can remove the element of human trust from both monitoring health and establishing an identity system. While being transparent but also keeping your personal information hidden. For me, trusting humans is a security flaw. If that element of trust can be automated away, it should be.
The problem has always been whether you can trust the people doing the automating. With blockchain, you can trust that the servers are running the code agreed upon by the node operators and miners. With FHE, the data processed by the miners stays anonymous, and if you need to display that data, say to a doctor, you can retrieve your encrypted data from a decentralized database (no one wants to manage their own data, just like most people don’t manage their own Lemmy instance).
Anyone can splinter off and change the code, but if it’s incompatible they’re isolated on their own network. Kind of like when a Lemmy instance’s content moderation policy is incompatible with others’, it gets defederated.
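For what it’s worth, full FHE is still very heavy in practice, but the “compute on data you can’t read” idea can be shown with a partially homomorphic scheme. A toy Paillier sketch in Python (tiny primes, purely illustrative; real keys are ~2048-bit):

```python
import math
import random

def paillier_keygen(p=1009, q=1013):
    # Toy primes for illustration only; real deployments use huge primes.
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # simplification valid because g = n + 1
    return (n, n + 1), (lam, mu)  # (public key, private key)

def encrypt(pub, m):
    n, g = pub
    nsq = n * n
    r = random.randrange(1, n)    # random blinding factor, coprime with n
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, nsq) * pow(r, n, nsq)) % nsq

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n) * mu % n

pub, priv = paillier_keygen()
c1, c2 = encrypt(pub, 12), encrypt(pub, 30)
# Multiplying ciphertexts adds the plaintexts underneath; whoever performs
# this multiplication never learns 12, 30, or their sum.
total = decrypt(pub, priv, (c1 * c2) % (pub[0] ** 2))
print(total)  # 42
```

Paillier only supports addition on ciphertexts; FHE schemes extend this to arbitrary computation, at a much higher cost.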
deleted by creator
But does it though? A blockchain is the ultimate zero tolerance policy. Lost your password? Grandma gave the house to a scammer? Too fucking bad
deleted by creator
Cryptocurrency is basically like digital cash. No one can control how you spend it, or take it away. But you can’t undo transactions without tracking down the recipient, and getting them to give it back. If you don’t trust anyone, cash and crypto are the only real ways to pay for stuff.
Cryptocurrency is basically like digital cash.
Cash doesn’t leave you holding worthless numbers when the founders cut and run.
Well, it does in some economies, but not the ones that cryptocurrency advocates actually choose to live in. If you live in Menlo Park or Toronto or Phoenix or Dublin, you live in conditions that would not be possible without a stable “real money” economy.
This is exactly the same thing as with cash. If you buy a major currency, like USD, euro, bitcoin, ethereum, etc., then it will be much more stable than some random currency. Would you trust a cash currency that was created by some random dude in an alleyway?
Honestly, I never bought that cryptocurrencies could remain unregulated for long; there was just no reason for governments to want it to stay that way. It probably took the regulators more time to catch up than I initially expected, but the writing was on the wall from day 1.
As for NFTs: yeah, I see what you mean. Digital asset management doesn’t feel to me like it particularly needed that kind of disruption. Like, there isn’t significant business upside or value to my house title’s ownership being stored on the blockchain rather than in my county’s private database like it is today. And since there wasn’t a reason for people to assign any perceived business value to the NFT over the private DB record, the NFT had no value, by definition. I could just never see it.
deleted by creator
We’re currently adding AI support to our platform for email marketing and it’s crazy what can be done. Whole campaigns (including links to products or articles) made entirely by GPT-4 and Stable Diffusion. You just need to proofread it afterwards and it’s done. Takes 15 minutes tops (including the proofreading).
takes 15 minutes tops
Not including the hours dicking around with prompts.
Nope, it really takes 15 minutes. You don’t get full access to GPT; you only get access to parts, and the rest of the prompt is filled in by our app.
The funny thing is, I imagine this won’t actually save marketers much time. If campaigns become easier to run I think it’s likely the number of campaigns going after a particular market will increase. That might limit their overall effectiveness. Marketers would then have to work harder to find creative ways to get their audience’s attention.
Well, it’ll save a lot of time that they’ll spend somewhere else, more effectively. I’ve worked as a dev in a marketing company or two and the backlog is always full, so it’s not like there will be a shortage of tasks to work on when one part of the work gets replaced by AI.
Only charlatans were recommending blockchain for everything. It was painfully obvious how inefficiently it solved a non-existent problem.
You could always tell because no one could ever really explain in simple terms what it does or why it’s useful, other than defending NFTs’ existence and enjoying the volatility of the crypto market (not currency).
I mean, it has its uses. You can think of it as a tower made up of building blocks. We can write things into the blocks as we build the tower, and every block is inspected by people worldwide to make sure no one’s messing with its contents and that they can’t be changed after the block has been placed.
It’s a really cool technology, but the main problem is that letting people around the world inspect and verify just isn’t needed in most use cases. It does a great job of removing the central source of truth, but rarely does anyone explain what the problem with a central source of truth was. Especially in a company setting: startups don’t want to build open-source software without a source of truth, they want to be the source of truth.
The biggest problem is that even OP is unaware of what is really being skipped: math, stats, optimization & control. And like at a grad level.
But hey, import AI from HuggingFace, and let’s go!
The worst part of ML is Python package management
Yeah, I feel like Python is partly responsible for most of this meme. It’s easy for very simple scripts and it has lots of ML libraries. But all the stuff in between is made more difficult by the whole ecosystem being focused on scripting…
The worst part of ML is Python package management
Do you have some time to talk about our Lord and Savior, venv?

Had to use WSL and manually set environment variables to get accelerate and bitsandbytes to work the other day; why can’t pip install just work? venv is just another layer that conda should be solving, and even that isn’t enough to overcome Python’s craptastic nature.
At that point you may as well go full Vagrant or start using Docker images.
And no matter how quirky or obtuse venv/conda/pip can be, they will never be as bad as Node. Ever. Node will hold that King Shit crown forever, or at least to God I hope it does.
Something worse than Node coming around and getting popular might just make me quit IT altogether.
Interesting, my problems with Node are usually in the chained build systems rather than in pulling down dependencies. Tbh I prefer Node; what problems have you had with it?
Aside from the callback chains and API shit, my issues with Node rest almost entirely on the lack of a standard library, because that led to the state of NPM today, which is just an absolute garbage-fire shitshow as far as I’m concerned.
I have my own separate issues with NPM, namely its dependency resolution (my God, just take dnf’s dependency resolution algorithm and use it), trivial packages that other packages list as a dependency (is this an int? Is this running on Windows? Better take this one line and make it a package!), and the relative inability to remove a package from a registry (did a secret slip in there while testing? Tough shit!). The worst of those is the trivial packages, I think, because you can end up with projects that have a dependency tree tens of thousands of packages long. And all that bullshit wouldn’t be even 1/16th of the problem it is today if there were a standard library.
You should take what I’m saying with a grain of salt, though, I’m just a DevOps Sysadmin, and aside from running some software that uses Node, most of my experience with it is unfucking it when our devs come to me to fix the tangled monster they’ve created.
Poetry gang
Nah, it’s pushing inference to prod. Any idiot can make an ipynb to train a model, but packaging everything into an app ecosystem is where you actually need a lot of non-ML software engineering know-how.
So true. I’m on an AI product team. None of the engineers know that much about learning/ai — our expertise is in high availability/scalability/distributed systems.
The AI part is when a data scientist hands us a notebook and says: implement this algorithm.
Underrated comment!!!
Why walk when you can learn to run!?
Where’s discrete mathematics?
Math? I write code, that’s words bro. Why would I ever need math?
This is sadly how a lot of Computer Science students think nowadays.
I think the problem is that they’re trying to teach math as if to generalists, when the students in front of them have been trained to understand problems programmatically.
If the problems were restructured as programming problems, it would work far, far better.
Mathematical exercises aim to solve 1 problem with 1 given set of parameters; programming exercises aim to solve 1 problem with ANY given set of parameters.
And that’s what made me lose interest in math during my CS years.
Mathematical exercises aim to solve 1 problem with 1 given set of parameters
Maybe you just had some really bad teachers, but I couldn’t disagree more. A big part of maths is proving statements that hold very generally (and maybe making it even more abstract, e.g. applying it to anything you can add and multiply instead of just real numbers). It kind of starts when your answers start being formulas instead of numbers, but it goes much further.
Shhh, they’re trying to be discreet about discrete mathematics
On the first floor
I’ve been programming for like 5 years now and never even attempted anything AI/ML related
Get to it then
Going to be starting computer science in a few weeks. I feel like AI/ML is something you want an experienced teacher for instead of botching something together
I learned ML most by tinkering with it myself. A lot of stuff went over my head when my teacher explained it.
That username does not check out.
Unironically sound advice!
Same. I know my limits lol
Unless you’re dead set on doing everything yourself, it’s pretty easy to get into.
Unless you’re dead set on doing everything yourself, it’s pretty easy to get into.
It’s not as bad as you might think if you have a specific goal in mind. My scope is mainly scripts and automation (python) for data analysis and productivity.
My boss was curious about market basket analysis so I spent an afternoon following along with a tutorial and researching and got the data I needed by EOD. Most of that time was cleaning up the input data and troubleshooting because it was a huge data set.
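For the curious, the core of a basic market basket analysis is just counting how often items co-occur. A toy sketch with made-up basket data (real runs use frequent-itemset algorithms like Apriori over much larger sets):

```python
from itertools import combinations
from collections import Counter

# Hypothetical transaction log: each row is one customer's basket.
baskets = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"beer", "chips"},
    {"bread", "butter"},
    {"milk", "butter"},
]

# Count every unordered item pair across all baskets.
pair_counts = Counter(
    pair for b in baskets for pair in combinations(sorted(b), 2)
)

n = len(baskets)
# Support = fraction of baskets containing the pair.
for pair, count in pair_counts.most_common(3):
    print(pair, round(count / n, 2))
```

Most of the real effort, as noted above, goes into cleaning the input so every "basket" is actually well-formed before this counting step even starts.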
This kind of vibe is becoming actually scary from a “no one knows how X actually works, but they are building things that might become problematic later” headspace. I’m not saying that everyone needs to know everything. But one really bad trend I see while fixing people’s PCs is that a shocking number of high-school and college-aged folks are deep into media creation and/or comp sci majors, yet they come to me with issues that make me question how they can know so many things that all involve computers, but not the computers themselves.
These next paragraphs are mostly a rant about how the OSes are helping make the issue grow with all users and not just the above. Also more ranting about frustration and concern about no one caring about fundamentals of how the things they make their stuff on function. Feel free to skip and I am marking as a “spoiler” to make things slightly less “wall of text”.
spoiler
Some of it is the fault of the OSes all trying to act like smartphone OSes, which do everything possible to remove the ability to really know where your actual data is on the device. It just goes on there with a “trust me bro, I know where it is so you don’t need to” vibe.

I have unironically had someone really, really need a couple of specific files, and their answer when I asked if they knew where they might be saved was “on the computer.” Which was mildly funny when my face led to them saying “which I guess is beyond not helpful.” I eventually convinced him to try signing into OneDrive like I had told him to while I checked his local drive files. Turns out it was not on the PC but in OneDrive, which made things much more straightforward.

Microsoft tricking people into creating Microsoft Accounts, and further tricking them into letting OneDrive replace the local “Documents”, “Desktop”, and “Pictures” folders at setup, is a nightmare when trying to help older folks (though even younger folks often don’t notice they are actually making a Microsoft Account either). It means that if I just pull a drive out of a non-booting computer, those folders don’t exist in the user’s folder. And if the OneDrive folder is there, the data is mostly just stubs of actual files, which are useless, and that can be bad if the person only had a free account that got too full, so there is now data that may be lost because those folders aren’t “really” present.
They know how to use these (to me) really complicated programs and media devices. They know how to automate things in cool ways, and create content or apps that I will just never wrap my mind around. So I am not calling them stupid or just “dunking” on them. But they don’t care to, or outright refuse to, learn the basic hardware or even basic-level troubleshooting (a lot of it is just a quick Google search away). They know how to create things, but not to ask how the stuff they use to create things works. So what will happen when the folks that know how things work are gone and all people know is how to make things that presuppose the other things are functioning? All because the only things that get attention are whatever is new, while the foundations are taught less and less. Pair that with things being so messed up that “fake it till you make it” is a real and honest mantra, which means only fools put actual credentials on their resumes.
It is all about getting the title of a job without knowing a damn thing about what is needed to do the job. It also means many problems that were solved before need to be re-solved as if they were brand new. Or things that were already being done are “innovated” by people with good BS-ing skills into obtuse forms that sound great but just add lots of busy work, and the next “innovator” who puts things back the way they were is seen as “so masterful.” History and knowing how things currently work matter for making real advancements. If coders just learn to always use functions or blobs from other projects without knowing what is in them, then they could end up basing basically everything on things that, if abandoned or purged, will make their own work stop functioning.
And “professionals” from so many industries are quickly coming to rely on these early AI/ML systems without question. They don’t verify whether the information they got is factually true and can be cited from real sources. They don’t see that the results were made by AI/ML doing what it has been taught to do, which is to create things based on the “vibe” of actual data. The image generators are all about taking random prompts, comparing against actual versions of things, and making something kind of similar. But the text-based ones are treated so differently, taken at face value and trusted to a scary degree. And it’s getting worse, with so many “trusted” media outlets beginning to use these systems to write articles.
To be fair to the people you are describing, the ecosystem that makes up the whole of software is so large and complex that becoming an expert in even a small area can take years. Sometimes you have to accept that it is better to focus on one area and not try to understand everything. A mechanic doesn’t need to know how rubber is made to change a tire.
Object-oriented programming is a meme, if you can’t code it in HolyC you don’t need it
If only TempleOS supported TCP/IP. Luckily there is a fork called Shrine that supports TCP/IP so bringing Lemmy there would be probably doable.
And precisely thus we will bring sin into God’s digital temple
(cue Doom Eternal soundtrack)
Not sure, OOP should come before data structures and algorithms…
It does. But the meme is showing the newbie skipping over everything and going straight to ML.
yeah, it should be the other way around, else OOP becomes too abstract to really understand
inheriting those data structure concepts early on can really help with learning some aspects of OOP later
IMHO, OOP is just dubious style points, but efficient data structures are far more useful.
So tabs instead of spaces?
Savage
And the handrails are YouTube.
Wouldn’t be surprised if now the steps are code and instructions provided by ChatGPT.
“And in the next episode, we are going to build a self-driving car from scratch.”
I fully understand why people would wanna skip all this stuff, but just learn html and css instead of programming at that point lol. I’d know, that’s what I did…
What actually is supposed to be the ideal way to learn? Say, for someone trying to be a sysadmin
Here’s a nickel kid. Get yourself a better computer.
If you want to be a sysadmin learn Linux/Unix. Basic bash scripting might be useful down the line to help understand a bit of what’s going on under the hood.
IMHO networking would probably be a better secondary place to focus for a sysadmin track rather than OOP concepts, algorithms etc.
Thank you for the response. I’ll be sure to up my PC hardware game soon since I have plans to leap into a career shift. What kinds of specs would look good in your opinion?
My advice would be to install any free virtualization software (VirtualBox comes to mind) and create some Linux VMs, dick around with them. No need to upgrade anything unless you’re using some ancient potato that’s more than 10 years old.
It’s a reference from an old comic
https://blogs.warwick.ac.uk/images/steverumsby/2004/09/20/1b2.JPG
As for Linux stuff, grab something small and low-powered (Raspberry Pi etc.) and start installing some distros! The possibilities are really endless. Set up a network-wide ad blocker, start your own IPTV server, set up a networked radio receiver, WireGuard for VPNs. Immerse yourself and figure out what you find interesting.
There isn’t a singular “right way”, but you need to know the basics of computer science like OOP, algorithms, and data structures if you want to be a decent programmer. Everyone has their own advice, but here’s mine for whatever it’s worth.
If you want to be a sysadmin, you should learn command-line languages like batch, sed, and bash (or a superset language like Batsh). Start simple and don’t overwhelm yourself; these languages can behave strangely and directly impact your OS.
When you have a basic grasp of those languages (you don’t need to get too complex, just know what you’re doing on the CLI), I’d recommend learning Python so you can better learn OOP, and studying networking while following along with the flask and socket libraries. The particular language doesn’t matter as much as the actual techniques you’ll learn, so don’t get hung up if you know or want to learn a different language.
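To give a flavor of the socket side, here’s a minimal echo round-trip using only the standard library (a toy, not how you’d structure a real service):

```python
import socket
import threading

def echo_server(host="127.0.0.1"):
    # Binding to port 0 asks the OS for any free port.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))
    srv.listen(1)
    port = srv.getsockname()[1]

    def handle():
        # Accept one connection, echo one message back, then shut down.
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return port

port = echo_server()
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
print(client.recv(1024).decode())  # hello
client.close()
```

Once this clicks, the jump to understanding what a web framework like flask is doing underneath (listening, accepting, parsing, replying) is much smaller.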
Finally, make sure you understand the hardware, software, and firmware side of things. I’d avoid CompTIA certs out of principle, but they’re the most recognizable IT certification a person can get. You need some understanding of operating systems, and you need to know how to troubleshoot beyond power cycling.
Your advice here is worth a lot. It really helped paint a picture of the environment around this profession, thank you.
There is a website called roadmap.sh which has both skill-based and role-based roadmaps for learning to program. There is no actual “SysAdmin” role path, since our job can take several routes by itself.
I personally use Debian at my org, and found Python and Bash enough to automate small things that need to be done on a regular basis.
But if, for example, you were a Windows SysAdmin, you’d have to learn to use PowerShell ~~or VBS (idk if those scripts are still a thing)~~.
Windows sys admin here, I haven’t seen a vbs script in ages. I’m primarily in PowerShell these days.
That’s what I thought. I remember the XP days when I was a kid, and I only remember seeing those as malicious scripts.
Thank you, been looking at a career switch for some time. I appreciate the input
I’m a Windows Sysadmin. A lot of places, just knowing your way around Active Directory and Windows Server is good enough to get your foot in the door. O365 management is pretty easy and check out some Azure courses on YouTube.
PowerShell has been helpful although I’m far from being fluent, “Learn PowerShell in a month of Lunches” was recommended to me and I agree it’s a good starting spot.
Build a Linux machine and just play around getting familiar with the CLI and basic commands, I build a lot of applications that we host on Linux AWS machines.
As others have said, networking knowledge is almost a must, so at the very least look into a Network+ cert or just run through the course. Cert-wise, Network+ and Security+ would get you pretty well rounded (for what it’s worth, I have zero certs, just have done some reading and never officially got certified; the ability to prove your skills will be “good enough” in most scenarios).
Most importantly, fake it til you make it. You will make mistakes and you will bring down servers. A good employer isn’t worried about the mistakes you make, but how you recover. I’m self taught with everything and started as a tier 1 tech support role for an internet company 10 years ago. If I can do it, anyone can.
College
PrivateGPT + CS books = ask books questions while self learning?
The issue with that is that LLMs tend to lie when they don’t know something. The best tools for that are Stack Overflow, Lemmy, Matrix, etc.
Yeah, and they don’t just lie. They lie extremely convincingly. They’re very confident. If you ask them to write code, they can make up non-existent libraries.
In theory, this could even be an attack vector. You could repeatedly ask an AI to generate code and, whenever it hallucinates a package, register that package name yourself with a malicious payload. Then you just wait for some future victim to do the same.
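One cheap defense is to vet the imports in generated code against a list of packages you actually trust before installing anything. A toy sketch; the KNOWN set here is just a stand-in for a real registry or lockfile lookup:

```python
import ast

# Hypothetical allowlist; in practice you'd check your lockfile or an index.
KNOWN = {"numpy", "requests", "pandas"}

def imported_packages(source: str) -> set[str]:
    """Collect the top-level package names imported by a piece of source code."""
    tree = ast.parse(source)
    pkgs = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            pkgs.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            pkgs.add(node.module.split(".")[0])
    return pkgs

# "totally_real_utils" is a made-up name standing in for a hallucinated package.
snippet = "import numpy\nfrom totally_real_utils import helper\n"
suspicious = imported_packages(snippet) - KNOWN
print(suspicious)  # {'totally_real_utils'}
```

Anything flagged here deserves a manual look before it goes anywhere near `pip install`.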
Joke’s on you! I piggyback on real programmers to do the hard work and just assemble their stuff into an ugly mishmash, and voilà! “My” app.