What prompt did you use to make this 🤨🤔
The irony. I bet the guy who prompted that calls himself an artist.
Was going to comment about how there is a stock photo for everything. Fingers seem too good for AI?
Nevermind, that kid's right hand… 😅
Also: the middle fingers are far too long
deleted by creator
Now look at his eyelids…
https://knowyourmeme.com/memes/family-laughing-at-crying-child-opening-christmas-present
That includes some history, but not the prompt itself.
That’s why I’m proud to be also programming in HTML
it’s only real programming if you also use CSS
It’s only real gatekeeping if you have a physical gate
“prompt engineering”
Sounds made up af
The US adds "engineer" to everything to sound more prestigious than it is. If you sell your service as an AI prompt writer, you get paid peanuts. If you sell the same service as an AI prompt engineer, the C-suites cream their pants.
So you’re telling me that people advertise themselves as AI programmers? That does not seem like something to brag about in such a manner
Yeah right?
I’ve found it helpful in learning things about languages I’m unfamiliar with, but it seems like saying “I’m an AI programmer” means “I don’t really know what I’m doing in this language, I’m still learning.” Which I suppose shows a willingness to learn, but that’s about it.
Lots of people think that computers are magic boxes. And now a diffuse entity in the cloud talks to them? Big heads will gobble that shit up.
It is, I believe the correct term is “proompt”
deleted by creator
Removed by mod
“Engineering”
Removed by mod
Yeah, well, like most software engineers lol
deleted by creator
They’re not engineers and they’re too chicken shit to act like engineers.
Looks like an ai did that
HATERS will say it’s fake
If you look close enough, all pictures are fake.
Is that you, Samsung?
Hey, no need to accuse the guy of cutting tvs to get out of honoring the warranty.
And HATERS will be absolutely correct
That’s the joke.
“prompt engineering” in itself is such an embarrassing term for the act of saying “computer uhhh show me epic boobies!!”
like that joke about calling dishwashing “submerged porcelain technician” but unironically
People in glass houses…
Software engineering isn’t engineering.
Yes, it is. Mostly because “real engineering” isn’t the high bar it’s made out to be. From that blog:
Nobody I read in these arguments, not one single person, ever worked as a “real” engineer. At best they had some classical training in the classroom, but we all know that looks nothing like reality. Nobody in this debate had anything more than stereotypes to work with. The difference between the engineering in our heads and in reality has been noticed by others before, most visibly by Glenn Vanderburg. He read books on engineering to figure out the difference. But I wanted to go further.
Software has developed in an area where the cost of failure is relatively low. We might make million dollar mistakes, but it’s not likely anybody dies from it. In areas where somebody could die from bad software, techniques like formal verification come into play. Those tend to make everything take 10 times longer, and there’s no compelling reason for the industry at large to do that.
If anything, we should lean into this as an advantage. How fast can we make the cycle of change to deployment?
I help make healthcare software. Mistakes can easily lead to death. Not most of them, but it's something we always have to worry about.
We might make million dollar mistakes, but it’s not likely anybody dies from it.
I had a coworker who got a gig writing PDA software for a remote-controlled baseball machine. He was to this day the most incompetent programmer I’ve ever met personally; his biggest mistake on this project was firing a 120 mph knuckleball (a pitch with no spin so its flight path is incredibly erratic) a foot over a 12-year-old kid’s head. This was the only time in my 25-year career that I had to physically restrain someone (the client, in this case) to prevent a fist fight. I replaced my coworker on the project after this and you can bet I took testing a little bit more seriously than he did.
I would like to subscribe to your newsletter
You are now subscribed to thathappened! Type oops to unsubscribe
It gives Kerbal Space Program energy.
In many cases this is accurate. Programming alone doesn’t amount to engineering. Lotta low quality lines of code being churned out these days because standards have dropped.
Judging by how some teams operate, and how some developers think, there are certainly cases where the "engineering" aspect is hard to find.
It’s not engineering either. Or art. It’s only barely writing, in an overly literal sense.
Making middle management do everything is not ‘running a business’.
If middle management is doing everything aren’t they no longer middle management?
They get middle paychecks.
And vetoed on sensible decisions in favour of non-sensible ones that make the upper management larger bonuses.
mm yes ai
Bro if you could get there just by prompting, it would be.
There are no models good enough to just ask for something to be done and it gets done.
There will be someday though.
Build an entire ecosystem, with multiple frontends, apps, databases, admin portals. It needs to work with my industry. Make it run cheap on the cloud. Also make sure it’s pretty.
The prompts are getting so large we may need to make some sort of… Structured language to pipe into… a device that would… compile it all…
I mean it can start much smaller.
Here is access to a jira board. Here are unit tests. Do stuff until it works.
Perfect! We’ll just write out the definition of the product completely in Jira, in a specific way, so the application can understand it - tweak until it’s perfect, write unit tests around our Jira to make sure those all work - maybe we write a structured way to describe each item aaand we’ve reinvented programming.
I see where you're going, but I've worked with AI models in depth for the last year, and there's some really cool stuff they can do. However, truly learning about them means learning their hard pitfalls, and LLMs as they exist today would not be able to build an entire application. They can help speed up parts of it, but more context means exponentially more VRAM, and eventually larger models - and that's just to get code spit out. Not to mention there is nuance in English that's hard to express, that requirements are never perfect, that LLMs can't iterate for long before they run out of VRAM, and that they can't do devops or hook into running apps - the list goes on.
AI has been overhyped by business because they’re frothing at the mouth to automate everyone away - which is too bad because what it does do well it does great at - with limitations. This is my… 3rd or 4th cycle where business has assumed they can automate away engineers, and each time it just ends up generating new problems that need to be solved. Our jobs will evolve, sure, but we’re not going away.
I mean, I had beta access to ChatGPT and have gotten excellent results from clever use, so I don’t appreciate the appeal to authority.
No, the jobs are going away and you are delusional if you think otherwise. ChatGPT is the DeepBlue of these kinds of models, and a global effort is being made to get to the AlphaGo level of these models. It will happen, probably in weeks to months. A company, like Microsoft for example, could build something like this, never release it to the public, and if successful, can suddenly out-compete every other software company on the planet. 100%.
Your attitude is a carbon copy of the same naysaying attitude that could be seen all over Hacker News before ChatGPT found its way to the front page. That AI wasn't ever going to do X, Y, or Z. Then it does. Then the goalposts have to move.
AI will be writing end-to-end architecture, writing the requirements documents, filling out the Jira tickets, building the unit tests. If you don't think a company would LOVE to part with its 250k+ per year software engineers, bro…
lol okay dude. You flippantly ignored all of the limitations I pointed out. Sure, it could happen, but not on the timeline you're discussing. There is no way they'll have replaced software engineers within a year; I call absolute BS on that. I doubt it will rise above Copilot within a year. I see it being used alongside code for a long time: calling out potential issues, optimizing where it can, and helping with things like building out YAML files. It can't handle an entire solution; the hardware doesn't exist for it. It also can't handle specific business use-case contexts. Again, maybe, but it'll be a while - and even then our jobs shift to building out models and structuring AI prompts in a stable way.
My attitude is the same because these are the same issues that it's faced. I'm not arguing that it's not a great tool to be used, and I see a lot of places for it. But it's naive to say that it can replace an engineer at its current stage, or in the near future. Anyone who has worked with it would tell you that.
I firmly do think companies want to replace their 250k engineers. That’s why I know that most of it is hype. The same hype that existed 20 years ago when they came out with designers for UIs, the same hype when react and frontend frameworks came out. Python was built to allow anyone to code, and that was another “end of engineers”. Cloud claimed to be able to remove entire IT departments, but those jobs just shifted to DevOps engineers. The goalposts moved each time, but the demand for qualified engineers went up because now they needed to know these new technologies.
Why do you think I worked with AI so much over the last year? I see my job evolving, I’m getting ready for it. This has happened before - those who don’t learn new tech get left behind, those who learn it keep going. I may not be coding in python in 10 years, god knows I wasn’t doing what I was 10 years ago - but it’s laughable to me to think that engineers are done and over with.
You seem mad and strongly opinionated, but I hate arguing when there is nothing on the line. Would you be interested in a gentleman’s bet then?
My thesis is that we'll have (or someone will; you and I may not have access) a form of interactive AI that can effectively code from scratch some kind of large-ish application (like a website), make changes to that website, add features, etc, in the next few years, like, very few.
I’d like to come to terms with you and lay down a bet. If need be we can start a sublemmy to post the bet publicly and we can be held accountable for public shaming if we fail to put up.
For the purposes of a bet, I want to suggest that a code base ‘as complicated’ as Lemmy is a good barometer. My getting this prediction right will be to show you an example of that happening in media, or ideally, being able to show it in use. I think in media should be considered acceptable.
In my circles, we usually make these bets beers or bottles of the counterparty's favorite drink, and I'm willing to offer you the following terms: 3:1 in the first year, 2:1 in the second year, and 1:1 in the third year. If the above thesis isn't confirmed, I'm wrong and I'll make it clear that I acknowledge that I'm wrong.
I would like to bet 12 bottles on my thesis based on the above terms, (where a case of 12 bottles of the preferred liquor or beer or whatever does not exceed $200, so like a 12 pack of good beer or mid tier wine).
Is that a deal you can agree to?
It will happen, probably in weeks to months.
in the next few years, like, very few
Now who’s moving the goalposts…?
There are no models good enough to just ask for something to be done and it gets done.
We call those “compilers”. There are many of them.
Sounds like someone’s worried about how easily replaced they’ll be in the future…
You sound like a class traitor
Realist, maybe. Often a pessimist. Never really a class traitor. Besides, I’m more blue collar than white collar, so I’ve never gotten the luxury of working from home at a higher pay, so as far as being the same class…in the sense of rich vs everyone else, sure.
Your snide comment just seemed a bit too gleeful about people about to lose their jobs. Or at least: lacking in solidarity with them.
Forget the distinction between blue and white collar, or higher and lower income: these aren't classes, and the distinction only serves to separate us in the class struggle. I meant the "wage-dependent class" here.
deleted by creator
Looks like someone is excited about shit content pumped out as fast as computers can munge shit to spit
Nah, that's going to blow, and I was saying exactly that several months ago. The internet is going to be completely fucked now. It had a nice little run of golden years from about 1995 through 2012. The decade after that was all downhill, and the last year or so has been a dumpster fire that's still getting bigger.
Yeah, writing prompts is the long-term goal; programming will be obsolete.
Nobody who can write a program in a structured language, taking edge cases into account, will be able to write a prompt for an LLM.
Prompt writers will be the useful professionals, because NO big tech company is trying to make prompt writing obsolete by making AI ubiquitous and transparent, aiming for it to work on natural-language requests from normal users or simply from context clues. /s
Prompt engineering is the griftiest side of the latest AI summer. Look at who is selling the courses: the same people that sold crypto courses, metaverse courses, Amazon dropship store courses…
You sound like you think prompt writer is an actual job. Chill out, man, it doesn't even exist.
Sounds like science fiction. No proof that it’s useful right now except copy pasta from StackOverflow.
But is “prompt hacking” considered actual “hacking?”
Using an IDE isn’t programming either
But I’ll definitely prefer hiring someone who does. Sure, you can code in Vi without plugins, but why? Leave your elitism at home. We have deadlines and money to make.
Edit: The discussions I’ve had about AI here on Lemmy and Hackernews have seriously made me consider asking whether or not the candidate uses AI tools as an interview question, with the only correct answer a variation of “Yes I do”.
Boomer seniors scared of new tools is why Oracle is still around. I don’t want any of those on my team.
Lol that’s like not hiring someone because they take notes with a pen instead of a pencil.
Thinking AI is an upgrade from pencil to pen gives the impression that you spent zero effort incorporating it in your workflow, but still thinking you saw the whole payoff. Feels like watching my Dad using Eclipse for 20 years but never learning anything more complicated than having multiple tabs.
For anyone who wants to augment their coding ability, I recommend reading how GPT (and other LLMs) work: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/
With that in mind, work on your prompting skills and give it a shot. Here are some things I’ve had immense success using GPT for:
- Refactoring code
- Turning code “pure” so it can be unit-testable
- Transpiling code between languages
- Slapping together frontends and backends in frameworks I’m only somewhat familiar with in days instead of weeks
I know in advance someone will tunnel vision on that last point and say “this is why AI bad”, so I will kindly remind you the alternative is doing the same thing by hand… In weeks instead of days. No, you don’t learn significantly more doing it by hand (in fact when accounting for speed, I would argue you learn less).
In general, the biggest tips I have for using LLMs:
1. They're only as smart as you are. Get them to do simple tasks that are time-consuming but easy for you to verify.
2. They forget and hallucinate a lot. Don't give them more than ~100 lines of code per chat session if you require high reliability.
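To make the "turning code pure" point concrete, here's a minimal sketch of the kind of refactor I'm describing (the greeting function is just an invented example, not from any real codebase):

```python
from datetime import datetime

# Impure: reads the system clock directly, so a test can't control the result.
def greeting_impure() -> str:
    return "Good morning" if datetime.now().hour < 12 else "Good afternoon"

# Pure: the hour is passed in as a parameter, so it's trivially unit-testable.
def greeting(hour: int) -> str:
    return "Good morning" if hour < 12 else "Good afternoon"

assert greeting(9) == "Good morning"
assert greeting(15) == "Good afternoon"
```

The transformation is mechanical enough (hoist the hidden dependency into a parameter, keep behavior identical) that it's exactly the kind of small, easy-to-verify task where an LLM shines.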
Things I’ve had immense success using Copilot for (although I cancelled my Copilot subscription last year, I’m going to switch to this when it comes out: https://github.com/carlrobertoh/CodeGPT/pull/333)
- Adding tonnes of unit tests
- Making helper functions instantly
- Basically anything autocomplete does, but on steroids
One thing I’m not getting into on this comment is licensing/morals, because it’s not relevant to the OP. If you have any questions/debate for this info though, I’ll read and reply in the morning.
Your original post referred to wanting to hire people based on the tools they use to do a task, not their ability to do the task - in fact, you talked down to people for using certain tools by calling them elitist. That’s why my pen/pencil comparison is accurate.
Personally, I think caring about that is silly.
I don’t get the downvotes. I’ve hired probably 30+ engineers over the last 5 or so years, and have been writing code professionally for over 20, and I fully agree with your sentiment.
I edited the comment to provide actual info, it was originally just the first paragraph
It's just the general AI hate. It's not surprising, because machine learning is yet another scam area. But for programming you would be a complete fool to ignore Copilot mastery, since paper after paper shows it has completely revolutionised productivity. And it's not normal to think you'll be better than everyone else while not using an assistant; it's just the new paradigm. For starters, it has made Stack Overflow almost obsolete, and that was the next most important tool…
AI’s not bad, it just doesn’t save me time. For quick, simple things, I can do it myself faster than the AI. For more big, complex tasks, I find myself rigorously checking the AI’s code to make sure no new bugs or vulnerabilities are introduced. Instead of reviewing that code, I’d rather just write it myself and have the confidence that there are no glaring issues. Beyond more intelligent autocomplete, I don’t really have much of a need for AI when I program.
This is how I use it, and it’s a great way for me to speed up. It’s a rubber duck for me. I have a fake conversation, it gives me different ideas or approaches to solve a problem. It does amazing with that
The code it spits out is something else, though. It's trained on GitHub, which means any given snippet could be based on someone with 2 months' experience writing their CS201 program, or on a seasoned engineer. I've found it faster to get the gist of what it's suggesting, then rewrite it to fit my application.
Not to mention the roughly 50% chance of a response like "hey, why don't you use this miracle function that does exactly what you need" - and then you realize the miracle function doesn't exist; it just made it up.
I use it a lot for writing documentation comments (my company’s style guide requires them), and for small sections at a time. Never a full solution.
Using an IDE definitely IS programming.
Sure, you can code in Vi without plugins, but why? Leave your elitism at home. We have deadlines and money to make.
Nothing elitist about it. Vim is not a modular tool that I can swap out of my mental model. Before someone says it, I’ve tried VS Code’s vim plugin, and it sucks ass.
Wdym? Vim is in every ide and notepad man
Certain shortcut keys in vim conflict with shortcut keys in the IDE. The flow doesn’t work the same.
I don’t understand how you think you will convince anyone that you can’t use vim, when so many do that without problems
Please avoid double negatives. I’m not quite sure of the meaning of your sentence.
If you’re saying I have issues using vim if I can’t use it in an IDE, no, that’s not how it works. If you use simple vim (not much more than knowing how to get in and out of edit/visual mode, and use hjkl for navigation), then it’s fine. Once you get into more advanced vim features, though, the key presses in vim get picked up by the IDE first, so IDE shortcuts take precedence.
If someone were to learn vim inside an IDE and develop it organically as part of their flow, it'd be fine. If you already have a lot of standalone vim flow set up in your mind, it's a problem.
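For what it's worth, VSCodeVim does expose a `vim.handleKeys` setting that hands chosen chords back to the editor, which mitigates (but doesn't eliminate) the conflict. Something like this in settings.json, with the exact keys being just examples:

```jsonc
{
  // Tell VSCodeVim NOT to capture these chords,
  // so the editor's own shortcuts win for them.
  "vim.handleKeys": {
    "<C-d>": false,
    "<C-f>": false
  }
}
```

It's per-chord whack-a-mole, though, so it doesn't really solve the "my whole standalone flow doesn't transfer" problem.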