All the good game programmers have left the game industry.
In fact most good programmers in all domains have left the software industry.
They were replaced by an army of turds who became developers because their parents told them to, have zero dedication to the craft, and now software is the worst it has ever been.
Come to think of it, even the CGI in recent movies is absolute dog shit, a 20-year step backward, so maybe we have a competency crisis beyond just programmers.
Picrel "The Downfall Of Minecraft's Performance":
why get paid a pittance to do something you're passionate about (and have no ownership of) when you could be getting paid twice as much or more elsewhere?
FPBP
Nobody with half a brain goes into the entertainment industry (especially vidya) willingly because they make peanuts, unless they're passionate about that specific medium in which case they do the smart thing and remain indie.
>do the smart thing and remain indie
Going indie is probably the least smart thing you can do. I genuinely cannot overemphasize how little money there is in indie games. Almost all of the money is absorbed by the big hits, or "indie" titles published by big companies.
Going indie is like buying a lottery ticket, only the ticket costs 5 figures and years of your life.
If shit doesn't turn around in 2024 I'm going into web dev. So much easier, so much more money, and I don't have to run a business in addition to trying to make games.
>Almost all of the money is absorbed by the big hits
and the average code monkey gets none of that, it all goes to the CEO and shareholders
Code monkey gets a regular salary and benefits.
Indie dev gets literally nothing until their game ships, and if it fails, tough luck. It's hard to live like that. It's impossible to start a family like that.
Who cares if the CEOs and shareholders get a bigger slice if I'm making a comfortable middle class income, when the alternative is actual poverty?
Whining about CEOs and shareholders is reddit-level socialism tbdesu.
frick you
>NOOOOOOOOOOOOOoOOOooO WHY DO THE CEOS MAKE SO MUCH MORE MONEY MANAGING 10000 PEOPLE WHEN MY PRO COMMUNISM PIXELSHIT FLOPS
shouldn't you be posting in /r/antiwork ?
delete this
don't worry, at least you're better than yandere dev...right?
I feel you, bro.
I'm the one on the right.
>it's another day in the dictionary mines
Right now all the logic is done and the """"""only"""""" thing left is adding the text, but I will 100 percent introduce moronic edge cases that mess everything up. It's the first game I've developed by myself and I dread opening the editor.
I've made sizable improvements with just over a year of dev. I've been programming for 4-7 years but I've made it a disciplined effort.
>improved
underrated
and this is how you get people to leave the industry and leave just the morons there. for me the median salary is enough, but you'd be lucky to get paid even that in the vidya industry
If you're passionate enough about a subject that you want to dedicate your entire life to it, then you would have no interest in relationships or starting a family. Nikola Tesla died a virgin and he wouldn't have been able to accomplish any of the things he did if he focused on women instead.
Tesla stopped contributing meaningfully to the field of physics around the middle of his life (which was around the age he could have gotten a wife and started a family). If you look at all of his later work, it's just useless schizo ramblings and ideas that went nowhere.
John Carmack has kids.
Gabe Newell has kids.
Linus Torvalds has kids.
Bill Gates has kids.
It's only weird virgins, like Notch and (You), who think forgoing a family to build video games in some sort of twisted mockery of a monastic vocation is somehow commendable.
>John Carmack has kids.
He wasted 10 years at facebook and will continue to do so.
Should have never stopped programming in C.
>Gabe Newell has kids.
Hopefully they can maintain his legacy and the status quo by not selling steam to big corp for billions like George Lucas did with star wars.
>Linus Torvalds
Linux userland is a complete disaster, git is unusable for game dev. He caved to the mob and became woke, plus adding rust was satanic.
>Bill Gates has kids.
His daughter married a sand n***** might has well say he has no kids.
The only reason any of those people have kids is because they're filthy rich. If they were just middle class autistic programmers they would all still be 100% virgins.
Notch has a kid btw
successful AA to AAA studios all did the smart thing of asset reuse: keeping your shit and improving it. they usually had success with the second game and could make more money with the third or whatever. obviously with small means you can't make it on the first project; in fact the first should be made to fund the second one, which is when you'll get recognition if it's good
not like it matters since they'll sell to some bigger company the second they hit success hah
True. Only the C-rate programmers go into gaming since you get paid so well elsewhere. This wasn't the case in the early '00s.
fpbp
even if you are a half-decent programmer who decides to go indie you would need the stars to line up to get any traction because all the online platforms are riddled with so much fricking shovelware.
Anyone with even a jot of intelligence goes into fintech so they can earn megabux while finding new ways to ruin the world around us.
fpbp
on a simple opportunity cost basis there's no reason to go into gaming if you actually know how to code. you're getting radically worse hours and pay just for the hope and a prayer of having a modicum of creative control over a project.
>when you could be getting paid twice as much or more elsewhere?
The frick are you talking about? Devs making bank are working the blandest, dullest jobs imaginable.
Protip: money doesn't scale with prowess. It scales with interest.
There's a massive embedded systems and systems programmer shortage (you know, the people who know their shit about computers, electronics, memory management, drivers, signals, timings, and other topics that require you to have a brain).
The vast majority of 'devs' today are just mouthbreather js devs who haven't compiled a single program in their lives.
embedded systems pays like shit thougheverbeit
Not all embedded is the same. Traditional tiny-scale embedded devs are just as shit as the js shitters. They also get paid shit because they live in the awkward space between just buying better hardware that's easier to work with, thus reducing headcount, and having enough volume to justify fixed-function hardware.
>There's a massive embedded systems and systems programmer shortage
you may not intend this, but trust me, there are tons of people willing to get into this, the problem is that those specific jobs are gatekept extremely hard by boomers and security clearances.
Let me explain what happens.
Say I am a fresh CS/CPE/SYS Grad coming out of Uni with a 3.0+ GPA.
I now am presented with a choice of how my career is going to go down.
I already know I'm not doing DS, so I am going to be a pro dev.
I already have an idea of my stack, and have learned a language or two while in Uni, and maybe even have a github.
So I apply for any entry level jobs I can find.
Interviews for embedded systems programming are extremely technical and although I have experience with C, they want more than just the ability to malloc and tell them what pointers are. Even at entry level interviews they expect more.
So I don't get any job with that, but you know who will accept me? Businesses using C#, Java, JS, and Python in their stack.
So I get my first job and in 3 years I'm applying again.
There is zero way for me to get into embedded systems because the window to get into it was tiny.
Saying that there is a programmer shortage is complete bullshit. Companies don't want to train anyone so there is a competent company shortage.
>expecting competence is hard gatekeeping
>the window to get into it was tiny
lol a starving drug addict romanian can do better than you and he does it for free
Standards for embedded systems are extremely high for entry level jobs.
This isn't my problem, this is your problem.
There is no competent programmer shortage. There is a shortage in companies willing to train or share knowledge.
I think you mean academia is sabotaging the field by giving CS degrees away like candy.
No I don't mean that. I mean you boomer fricks gatekeep entry level jobs so that only people with expected experience can get it.
The boomers in embedded systems had no fricking idea what C was, they walked in with an engineering adjacent degree and applied.
>The boomers in embedded systems had no fricking idea what C was
You're fricking stupid. The boomers in a freshman CS class had already spent their whole childhoods programming in BASIC and learning niche languages like ML or Prolog. Most were hardcore C fanatics. When it came time to shake hands and interview for jobs anywhere from embedded systems to robotics or machine learning, they nailed most questions because that's what they got the degree for: actually knowing shit about computers and their function. They would not have graduated otherwise.
Your incompetence IS your problem. A training period of 3 months isn't going to be enough to teach your mathematically and logically illiterate ass to do more than what you can. This is never going to change. Somewhere along the way higher education became something you do in spite of self-study, and your degree does not mean you're owed a job. Congratulations for buying in on a scam. Try not being literally worthless next time.
>You need to do a ton of due diligence for entry level
>But we'll also pay less than a JS monkey shop
the monkey shop bubble is bursting in 10 years doe
They said that like 10 years ago.
They also said that Uber was the peak of the bubble, then that WeWork was the peak.
At this point I only expect web to collapse through full automation.
Webdev is easy and riddled with incompetence, so I assume it's going to be the first thing to go thanks to AI. Or not really go, but it's going to be heavily devalued.
every time i've read a story of how someone's career in tech started, it was nothing like this and instead the dude just walked in and got a job because he owned a computer at home
what an elaborate bait tbh
the future is to make your own games for yourself or in a small team and keep as much profit as possible with collab publishing deals AND nintendo/other homogenized companies
slop devs that work for unstable slop companies or on giga corpo companies like microsoft are the real DOOMED shit in the industry lol
my friend is a tool programmer, has worked for several large developers, and makes well into six figures working from home.
maybe you all suck ass at your jobs and repeated, endless botched launches mean nobody wants to pay you. my friend actually knows how to do his job so he makes good money.
you have no idea what the tech industry is like. competency in tech is very hard for companies to measure. you have totally average programmers who repeatedly job hop into better paying jobs, and they call it impostor syndrome when it's really management or HR going off cargo cult metrics. at least half of your success in tech is based on completely trivial and random variables, like whether you happen to graduate during an industry boom or bust, which basically determines your CV for the next 2-3 years, which in turn determines your success and salary for the next 10 years, which compounds for the rest of your career in some way or another
Minecraft never had good performance. Optifine was basically a requirement in the beta and alpha days. The only version that runs well is bedrock, and that's purely because it isn't written in Java.
Picrel is a great offender at that
>play 2013 on a literal potato
>flawless performance
>no lag
>constant 60 fps
>2023, on a better hardware
>constant frame drops
>rendering jittery
>limiting your fps via console commands literally makes game a lagfest
>not limiting your fps makes it jump from 120 to 40-50 constantly
>game cant hold up to stable 60 fps in difficult spammy scenarios
I thought I was a nostalgia-driven schizoid until one day I tried a fanmade TF2 classic mod that ran like 2013 TF2, and holy shit, we're in the dark ages of gaming
>All the good game programmers have left the game industry.
All the bad ones are what made it up in the first place.
>They were replaced by an army of turds who became developers because their parents told them to and have zero dedication for the craft and software is the worst it has ever been.
>Come to think of it even CGIs from recent movies are absolute dog shit, going 20 years backward in time, so maybe we have a competency crisis not just for programmers.
They were replaced with arrays of tools that do the job for them, easier languages, even game-making tools. Stuff that someone with no wisdom can abuse and still create things, such as the infamous man that Nintendo should hire. Combine this with the volatility of the OSes and software it all runs on, and it's even more of a pain in the ass.
These games are a case of never being complete, the definition of bloat itself. They were never meant to run the shit that's been added since inception, and each little update adds more and more chances to frick up something that causes even more bugs down the line. The reason they never get patched? Because they have more shit still to add, so they'll get around to fixing it... maybe, or kick the can down the road more.
A continuous game isn't going to become unplayable with time by default. All they have to do is keep optimizing the game along with any new updates they make and any new content they add.
The best example of this is Warframe. This game has been getting new content constantly for the past 11 years and it runs extremely well considering how good the game looks even on the lowest settings. Until 2 years ago I'd been playing it on potato hardware (a gtx 550ti and some intel processor from 2010).
Granted, I had to play it on the lowest settings but the game still looked good and didn't have any issues regarding lag unless I was doing an endless mission for over an hour with 3 people particle spamming the shit out everything.
Source is not a good engine for an asset-heavy pipeline. It's arguably one of the worst, which is why Apex Legends had to rewrite the foundations of the engine's code even to get to Alpha.
Valve sucks at programming now. One of their lead programmers got into a fight with fans on twitter saying he doesn't care if a game has a 90% fps drop as long as the fps stays above 60.
Forgot to mention that it was about CS2 on like 4090s and shit.
>has a 90% fps drop as long as the fps stays above 60.
Given frick all happens in Source unless you hit 1000+ fps, yeah that seems reasonable.
Can't really speak so much for Source 2.
I believe that, just looking at CS2. what a disaster.
wtf are you trying to say here?
>I believe that, just looking at CS2. what a disaster.
It's a complete disaster, fricked beyond repair on so many levels I don't even know where to begin, so I stopped playing.
same, I uninstalled about two weeks ago and I'm not reinstalling until I hear that some massive changes have occurred (mostly fixes to performance and hitreg) and all maps have been added back.
it looks broken at the core like CS Source
In the source engine.
Everything functions perfectly fine on the tickrate, and your FPS is functionally unimportant short of god-awful lows like 30fps. The game only starts to change once you hit 1000fps, when parts of it like VPhysics and QPhysics start to break; if your framerate goes up to 1500, basic game logic starts getting weird.
Used to make sourcemods but quit when I stopped using windows since it's annoying to run Source SDK on linux.
Haven't gotten a chance to try the Source 2 SDK, because even though Valve has released:
Steam Home
Dota 2
Alyx
Fartifact
The Robot Repair Demo
Desk Job
and even though a pretty MASSIVE part of the entire move to Source 2 was because of modders on linux,
Valve has still decided they aren't going to release any of the SDK on linux, despite the fact that they DO have it in house and some of the developers at Valve use it.
but what has any of that got to do with frame DROPS? the dev was apparently making excuses for framedrops "as long as it stays above 60" which is a god awful attitude to have towards optimization. and this attitude really shows in Valve games. CSGO had this problem where even while running at 300 fps there would still be individual frames bad enough to ruin the experience. and CS2 can't even maintain framerate, let alone avoid stutter. there are STILL parts of CS2 maps that consistently run at less than half the normal performance. that's dog shit programming.
Because you dumbass as long as it's sticking at or above 60fps it virtually doesn't matter.
serious FPS games should target 240 fps with consistent frame times. i can easily pull that in overwatch 2 even with every visual effect popping off at once. it's actually magic how well they made that game run
>sell 500Hz gaming screen
>frame rate virtually doesn't matter.
maybe the problem is not just on the developer's end
so you only play games with a controller. you don't get to have any opinions about PC gaming. frick off.
that's nonsense. no CPU in the world can run CSGO at 1000 fps, let alone CS2. and even if someone could, it wouldn't lead to DROPS.
>that's nonsense. no CPU in the world can run CSGO at 1000 fps, let alone CS2. and even if someone could, it wouldn't lead to DROPS.
I did a google search and instantly found people managing to go way over 1000 on GO. No idea about 2 tho, you're right.
I assume reducing the window size and the graphical fidelity would make it easier to achieve.
I believe he implies people complaining might be running the game at way too high framerates to begin with, possibly resulting in those drops
no idea if source 2 has the same problem as what he mentioned tho
It's not really that valve sucks at programming, it's that the people who programmed all those well performing games (base tf2 included) shifted their attention to brain interface and vr development instead. But they're still at valve. I think Alyx performs pretty well, wouldn't be surprised if the boomers were in charge of it.
All the programmers at valve are moronic zoomers now.
>But they're still at valve.
truth is that they all got burnt out from those old days and never fully recovered
I know you probably don't care and are just here to complain, and I get it, this is a thread where we complain about lack of optimization, but: I use a modified (eyes enabled, for example) version of this https://drive.google.com/file/d/0B6-BppkGrOVfZ0VBZjlpQWRraVU/view?resourcekey=0-DlSCzo89fg1WK4ipIihSOA and I can get a stable 144 on almost every map and the game still looks like tf2.
I feel crazy every time this thing somehow runs worse. “There’s no way a game from 2007 is running this bad” is what enters my head, and the other thing I’m playing is OSRS so it hits like a truck.
What's up with this? Is it just because source can't handle it anymore or what?
>play 2013 on a literal potato
Not gonna read the rest of your post but I vividly remember the game going to shit in 2010-11 the more updates they poured in.
I started tf2 in 2009 with a Radeon HD 5770 and a Core2Duo E8400 which was a nice budget pc in 2009. The game used to run at 200-250fps+ on release.
Even when they released goldrush and the cart-pushing shit, the new weapons, and a bazillion new achievements for your client to check every millisecond. In just 2 years I lost half my frames, and I'd often drop to 90fps in super crowded maps like dustbowl.
So yeah I don't think TF2 is a good example at all.
he said it's a great offender you blind moron
>hire ranjeets
>performance miraculously starts being shit
really makes you think huh. no worries durgasoft to the rescue!
>play 2013 on a literal potato
>flawless performance
>no lag
>constant 60 fps
unlocks had no LODs when they were released, literally the most basic optimization failure possible
TF2 was never good THOUGH
>Team Fortress Classic
I miss it so fricking much
My man
>we're in the dark ages of gaming
I welcome it with open arms you can never go back
What the frick happened to this game, i could run it at high 60 with a pentium and a gt 610. With my current system it makes me sick from how much it bobbles from 20 to 100 with everything at minimum
>All the good game programmers have left the game industry.
You can thank management and the proliferation of moron-friendly languages for that.
I've met programmers that don't even understand how strings work on a lower level, yet work at mid to senior level roles.
You won't get low-level optimization out of the people working in the industry now because they have no idea what that even means, and they consider it not to be part of their job description.
>how strings work on a lower level
well, how?
A string is a sequence of characters.
In C, which is the best way to demonstrate this, a string is an array of characters that ends with a null terminator '\0' to mark the end of the string.
When reading a string, the system simply starts at the first character and continues reading until it reaches the null terminator.
>forgot it's encoded in utf-8
are you one of those bad programmers by any chance?
>you didn't go into detail about how different encoding can alter the size of data and how it's read in this brief generalization about how strings work on a lower level!
I'm not here to do your CS homework for you, Anon.
I've been programming for only 1 year and I already knew this. I wrote my own implementation of the String class in Java. Learning C soon and then C++.
wow, good job you fricking moron, you know the absolute minimum about how a string "works". A 1st-year CS student learns that in the first few classes. Do you think you're not on the same level as the developers you're complaining about? At least they have experience and know how to handle projects.
as an array of char variables, basically. exact details depend on the implementation, but frequently you have a null character at the end of the string to indicate that it's over. so e.g. the phrase "OP is a homosexual" would consist of an array of 8-bit values representing the ASCII for O, then P, then [space], then i, etc. Then at the end, there'd be a char with a value of 0, usually written as '\0' or NULL. this is called a null-terminated string.
Null terminated strings were a mistake and only C really uses them. Game devs would be using std::string.
Why?
As there is no metadata header, getting the length is linear time. The "simple" string manipulation functions are a pitfall deeper than the Mariana Trench because:
- char * is not synonymous with a string; a string is only valid with a null terminator.
- strlen could in theory walk through the entirety of your virtual memory space.
- strcpy could end up reading your entire stack if you are missing a terminator.
Never write code that isn't moron proof.
Yeah, I see. I didn't think about it that way. Thanks.
>ever using non static strings
Processors of more than 16 bits are tools of the devil. I bet you use fl*ating points too
>- strlen could in theory go through the entirety of your virtual memory space.
>- strcpy could end up reading your entire stack if you are missing a terminator.
No, it can't, memory protection is implemented within the kernel.
It could only do so "in theory" if you actively modified your kernel.
Attempting either of these operations would result in a termination of the application due to a segmentation fault.
This has been this way since the UNIX era.
The whole argument is on par with "we need to put parachutes on cars because someone could in theory drive their car off a cliff".
Memory protection is a hardware feature. Kernel just sets the handlers and handles interrupts from the MMU
>ESL javajeet doesn't understand what a segmentation fault is
Peak DurgaCore post.
I don't think you know what a segfault is so how exactly is one fired up?
Also never rely on segfaults for overread issues with strings. Remember heartbleed? That was an overread error.
>overwrite null character with some other data, either by mistake of programmer or by malicious user
>program goes to read a string
>null character is no longer there
>it just continually reads data until it hits some block of memory with a value of 00000000
>best case scenario, get back some junk data that causes strange, glitchy program behavior or crash
>worst case scenario, major exploit/security vulnerability
Well then don't do that.
>antirez's Simple Dynamic Strings (SDS) are 10 years old
>Game devs would be using std::string.
Shitposting aside, yeah, in practice that's the best option.
>Game devs
>anything STD
l m a o
No quicker way to frick over your codebase and compiletimes.
Making a string buffer is literally just a 3-field struct containing a buffer, the size of the buffer and the length of the string stored in it.
Then define a few helper functions to clear/push/print it. C strings are a solved problem.
>Making a string buffer is literally just a 3-field struct containing a buffer, the size of the buffer and the length of the string stored in it.
You could probably get even simpler than that: just store the length, and have a get_size function return the length times the character size.
how would you handle different encodings?
Would it not be possible to use something like
>sizeof(my_string[0])
or is there something fundamental I'm missing?
certain encodings (like UTF-8) are variable-length. in UTF-8, some unicode code points are encoded with 1 byte, others with 2, some more like CJK with 3, etc.
t. implemented it in DOS
>CJK
CJK refers to a collection of encodings like EUC-JP, EUC-KR, GB 18030, etc but isn't itself an encoding format.
yes
CJK just means chinese/japanese/korean, i wasn't talking about encodings but just the kind of characters those languages use.
>You could probably get even simpler than that
I don't think so. The size of the stored data and the length are two different things on the machine.
>just roll your own X
NIH syndrome is also an anti pattern.
Anyone who talks about patterns is part of the pseuds destroying software.
>n-no I can't write a simple struct to handle my use-case that even a beginner could write
>instead I should pull in the C++ STD written for 5000~ use-cases, none of which I share, with performance concerns unless you buy into the whole shebang of the STL, string views and everything so that sooner or later you're infested with RAII and other forms of venereal diseases that make you think refcounting is a great idea, pinning your code on a bad GC instead of just centralizing your pooled memory allocations to the systems that need it and letting them handle it
The STD is and remains a pile of shit.
std::string is actually null terminated, and pretty much every other string implementation is too. There is no getting away from null terminated strings since the operating system and all standard libraries use them.
>There is no getting away from null terminated strings since the operating system and all standard libraries use them.
C ruined programming in so many ways
Yes, but how often are you passing strings to kernel APIs? Not really that often. Also it's obviously good practice to make the strings in your string library correct C strings, so that C code expecting them doesn't get anal prolapse.
It’s a “string” of pointers that point to a character of that “string”
this. the most efficient string implementations are linked lists
The problem is the systems have gotten so complex there isn't any room for low-level optimization anymore. Game engines are big and complex, assets are massive, and the effort would be disproportionately large. Computers themselves have gotten so complex that there are very few people around who understand all their quirks.
>the systems have gotten so complex
big fat lies.
The problem is everything is wrapped in 20 layers of middleware and abstractions. You can't touch anything.
Shitty software existed too 40 years ago (e.g. no integrity checks, uncaught exceptions, poor error recovery, no timeouts, insufficient sanity checks, hard-coded paths, lack of proper feedback, etc.), but we've traded one kind of hell for another (slow bloat, unscriptability, clunky mouse-driven UIs, non-configurable software, software that crashes without even displaying an error message or logs, etc.)
Nah, too many morons. Giving low level access means more chances for them to frick up. Just compare shader compilation on DX11 and DX12
>DX11
>moron proof
>took care of shader compilation
>ran flawlessly on a 4c/4t CPU@3ghz
>DX12
>low level access
>shader compilation is left to the developer
>the "best" developers waste your time with precompilation taking half an hour when you first boot up the game
>still runs like shit
>stutters on a 8c/16t CPU@5ghz
That’s not entirely fair, part of what makes DX12/VK worthwhile is that by configuring shader pipelines for the specific target hardware they can run faster (i.e. not weighed down by abstraction layer), which they do. The first-time stutter is a new CLASS of problem devs are having to learn to solve, because it’s the problem of “installing” shaders per target device. Runtime generation sucks but you’re also b***hing about precompile times
>but DX11 didn’t
And it also didn’t run the same shaders at framerates/res as high as DX12 can, if it could run them at all. That was the tradeoff. You’re basically asking why modern devs, with access to a closer-to-metal framework, haven’t reinvented DX11’s abstraction layer to slap back over top of it, thus eliminating the performance advantages
>but but it's smoother between the stutters
Unironically yes. If you weren’t chronically obsessed with disposable linear cinematic experiences, you’d recognize that PSO stutters only happen the first time you see something until your GPU drivers update but I digress. I’d much rather hitch the first time I see a boss on my first ever playthrough and then get 60FPS after that, than have a hitch-free 40FPS experience.
>Only the first time
Absolutely untrue, it's all the time.
Just stop making up shit.
If you’re hitching on repeat viewings, it’s not shader stutter, it’s the dev fricking up some other way, probably loading GB worth of asset data into RAM every time an object appears
>tells anon to stop making shit up while making shit up
They’re right about how shader compilation works, by the by.
I'm ready to bet money that most of these stutters are caused by improper texture loading/caching, since it's easy to get tricked by the driver into thinking a texture has been loaded when the work is actually delayed until the texture becomes active and is sampled.
So every time you enter a chunk of map with textures that haven't been used yet, even if they have been "loaded", the upload to the GPU happens all at once.
It's driver frickery, but the code to do it properly is not trivial at all, especially if you don't do it upfront.
This is why Vulkan is based. You just upload shit whenever is the best time.
There is still a driver in between my dude, nothing is magic.
Vulkan hasn't solved shit in that regard.
Yeah, but the driver can frick off with this behaviour at least. It does what I tell it to, when I tell it to when it comes to memory.
>Yeah, but the driver can frick off with this behaviour at least. It does what I tell it to, when I tell it to when it comes to memory.
My point is that it doesn't. You're still just producing a command list, even if a very low-level one with Vulkan, but the driver will delay the actual texture upload.
Well, yes, but it will only get delayed by other work already in the pipeline, not by some random whim of the driver.
>not by some random whim of the driver
>upfront
I meant early in the project, since it affects the architecture.
99.99% of game devs have no clue about this.
>be given free rein
>frick it up
gee, it's almost like the problem is incompetence and unwillingness to do a proper job.
While obviously code standards have fallen dramatically for consumer software, expecting devs to implement black magic with pointers and casting like it's the 90s just to squeeze out some extra frames just isn't viable at the current scope of video games.
This. I'm tired of everyone thinking the standards for technology are the same now as they were back then; they fricking aren't. Everything is hundreds of thousands of times harder now than it was when they were banging rocks together, and on top of that people keep hiring moronic third worlders to do the work. It isn't fair.
>Everything is hundreds of thousands times harder now than it was back then
And you can thank the bad programmers who created this unbearable mess
No, I can thank the jackasses who advanced technology. Back then you would be called a genius for doing what is basically 2+2=4 today, because that was as hard as it got. It is impossible to match the legacy of the greats from when this industry was getting its footing, because they had it fricking easy.
Not true, bad code can just be rewritten
It's not "just" code, it's a whole stack of shit, layers upon layers of abstract, over-architected garbage.
From the OS and drivers, up to a web framework updating a virtual DOM in a JavaScript VM inside a sandboxed process in a web-view Unity plugin, just so you can have a UI showing the latest news in-game.
Programmers back then had to write assembly for each CPU, each graphics card and each sound card on the market.
Now we have multiple libraries which can deal with that.
>Everything is hundreds of thousands times harder now
sure thing zoom zoom. Today's programmers have to write assembly and code 'tricks' to force the 8K of graphics ROM to do something completely new and barely possible?
Sakurai coded a game using a virtual keyboard and a snes controller. Keep crying.
It's not viable because the companies want minimum viable product with maximum israeliteery and profit.
This is the biggest reason for stupidity across any industry: they hire 10 juniors for every 2 seniors and outsource the bulk of the work. That will never cultivate a proper culture of learning.
At the same time, "coding" and "graphics" have become commodities which any hobo can supposedly learn in some school, but the truth is some professions need a lot of dedication. Not everyone can be an illustrator just because they went to some school...
Saying that the company is just cheap and lazy for not getting OG Doom levels of optimization on literally millions of lines of code is ignoring that they simply could not get enough talented programmers to perform that level of optimization, even if they could afford it. It's also ignoring a bigger reason why games ACTUALLY perform like shit now: intentionally bloated assets to dominate hard drive space.
This. The higher-ups don't give a single frick about the longevity of what they are doing, and the senior staff is either overworked or discouraged from teaching the junior staff how things ACTUALLY work.
Pretty much every industry has been filled to the brim with people who don't want to create or preserve, only to consume and leave nothing for the people after them. Boomers, Gen X and early Millennials must all despise their own children, that's the only reason this makes any sense at all
Everyone gets greedy when they've got theirs. One day it'll happen to you.
>10 juniors to 2 seniors
Haha as if. For juniors it's pretty much impossible to get into the field for the past 2 years. IT turned into your generic nepotism market just like everything else that pays well.
It's not even going to be a performance gain nowadays compared to compiler idioms. Compilers are much better than most C programmers at producing machine-friendly code, especially with instruction-level parallelism and vector instructions in mind.
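For example, the plain loop below is exactly the shape compilers and JITs auto-vectorize best; the hand-unrolled version computes the same thing but mostly just obscures it (illustrative sketch, not a benchmark):

```java
// Same reduction written two ways: the plain loop is what the optimizer
// handles best; the "clever" manual unroll usually buys nothing today.
public class SumStyles {
    public static long plainSum(int[] a) {
        long s = 0;
        for (int x : a) s += x;      // trivially vectorizable by the compiler/JIT
        return s;
    }

    public static long unrolledSum(int[] a) {
        long s0 = 0, s1 = 0;
        int i = 0;
        for (; i + 1 < a.length; i += 2) { s0 += a[i]; s1 += a[i + 1]; }
        if (i < a.length) s0 += a[i]; // leftover element for odd lengths
        return s0 + s1;
    }

    public static void main(String[] args) {
        int[] data = new int[1001];
        for (int i = 0; i < data.length; i++) data[i] = i;
        System.out.println(plainSum(data));     // prints 500500
        System.out.println(unrolledSum(data));  // prints 500500, and no faster
    }
}
```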
>black magic with pointers and casting like it's the 90s
software is WAY more bloated than you think. Also see "perfect is the enemy of the good."
It's unacceptable for shit like pic related to happen.
Can we talk shit about CS majors yet? Software engineering as a degree is often passed over and roles like working at Mojang require a CS degree. Computer science is basically like
>here's a few introductory courses in some programming languages
>here's how stuff like compilers/shells work, implement a basic one in C
>here's 4 classes on boolean logic
>here's basic data structures/algos you can copy paste online
>ok here's your degree have fun
Most of them lack critical problem solving skills and don't know how to use a terminal or git. It's a nightmare trying to work with them. Meanwhile SWE grads just programmed for a solid 4 years and can actually write code. CS is just academia jacking off to the various theories and proofs that make computers work and testing you on them
When was the last time you ate a kiwi?
3 months maybe? I felt like eating kiwis and had two in a week. Now I'm all about oranges.
Ironically, Minecraft Bedrock is amazingly well optimized.
Java is such a shit engine it's impossible to optimize the game, but it's too bad Bedrock is shit and all the good mods and community are on Java. The game having to be simultaneously developed for both engines is partially why updates take forever.
>optimize
Sir?
>do the smart thing and remain indian
Sir?
>amazingly well optimized.
Sir?
Do the needful sirs
mindbroken
Java isn't an engine...
Obviously he meant Java version you dimwit
considering the average opinion about java on Ganker is
>java is a bad language
>it just is, okay???
>my qualifications? uhhhhhh
I don't think that's a safe assumption
Java is a bad language, one of the easiest to manipulate. Gj moron
Java is great for some tasks, but it's not very performant which is why using it to make a game is moronic.
It's not as performant as C or C++ but it's not as slow as Minecraft would have you believe
They initially left Bedrock to Jens as a side project, then realized mobile had more sales than any other version and sent their C-team to work on it, thinking that it being written in C++ would be enough to outweigh the shit code they wrote, performance-wise (they were quadruple-rendering tree leaves until a few years ago). Now redstone's a massive mess, and they'll never add the combat update because the PVP "community" on mobile would seethe that they can't just spam to win (and touch controls make shields awkward). They also outright removed symbols (despite the community as a whole saying not to, since that's what mods need to work). So now you have a buggy soulless cash grab with no mods ever besides their shit "API" where you can only reskin stuff and make custom models. It's a shame, because modded MC is some of the most fun I've had in a game, but it's a resource hog on Java.
play the switch version of bedrock edition and tell me that
Well the Switch is just dogshit all-around so that's not really any devs fault their games don't run well on it. Nintendo's own games don't even run well on it.
if you play Nintendo games on a Switch instead of PC you're not a fan, you're a drone.
I wouldn't say "amazingly"; one of the bigger 'optimizations' is making no attempt at game-logic parity, which leads to fundamental problems like nondeterminism.
No, Microsoft just hires cheap soulless labour.
See 343i, the coalition etc.
Competency crisis doesn't exist. The only crisis is actual important work is being outsourced.
Well, it's a videogame, so who cares about the performance as long as a 9 yo kid can run his own little farm.
Why bother rewriting the game to make it multithreaded when the effort could just go to adding more bullshit?
Isn't that what most people want?
Minecrafts problem was that Notch never wrote the code thinking it would be built on for years. It was only supposed to be good enough for the base game. It must be near impossible to work on whatever spaghetti code hell it is today
It's unlikely any of Notch's original code is still in the game by this point.
Don't you know how coding works? They build off it. They didn't reinvent the game from scratch.
Anon, Minecraft's engine itself has been heavily rewritten over the years. This includes (but is not limited to), MULTIPLE rewrites of the lighting engine, rewrites of the rendering/chunk culling, rewrites to the save format several times, rewrites of their NBT data format and implementation, just recently the networking packets have been reworked entirely (although it's not yet clear why). Nearly every engine-level aspect of the game has been rewritten several times now.
I'm not saying they didn't, but why the frick do all the amateurs keep calling everything a fricking engine? This is wrong. Minecraft doesn't have a single engine in it.
Just arguing semantics here, Minecraft (Java) may as well be classed as its own proprietary engine using the LWJGL. AFAIK, Bedrock is built using the Renderdragon render engine instead.
>Just arguing semantics here
Yes, but no. It's semantics if you're a casual; it's a no-go if you're in a professional environment. If people keep calling anything that generates a light source, plays a sound, etc. an engine
>insert game name engine part of the engine that was developed in x engine
it's no fricking wonder that the garbage the gaming industry produces sucks sooo much.
to be honest, I don't care, and I'm not a programmer; it isn't my job to know the semantic difference between what is or isn't an engine.
I hope that you're not in the IT industry in any way. If you're a carpenter, that's fine with me. It doesn't even have to do with programming directly. They're just different kinds of things. For example:
something allowing you to use a specific israeliteVIDIA technology in your software development is NOT an engine, BUT the israeliteVIDIA thing itself MIGHT be an engine, depending on what it does. Importing a simple data library is NOT a fricking engine.
I'm neither a programmer nor in any IT position dude. Blaming random anons for modern game code being bad is a really weird stance.
I'm not blaming your individually, but incompetent people being in the industry at large is one of the core points of the thread.
It sounded like you were blaming me, as well as that other anon
>data library
what is a "data library" now?
>data library
it's a library that contains data, which is every library in this universe. No, I'm not talking about a database. I'm talking about data in the form of a library, which you might generate during dev for your piece of shitware.
I'm not. I'm just pointing out that terminology is always important. You shouldn't get defensive; you should learn instead. Which is one more problem we have with modern work environments: nobody wants to learn, everybody is sooo offended all the time.
As George Carlin put it a couple of decades ago:
>we don't have stupid people anymore
>everybody has a learning disorder
>You shouldn't get defensive, you should learn instead
If you could explain why Minecraft doesn't have its own engine, that'd help.
I don't know that. But I'd assume that Minecraft just makes use of other existing engines that are free to use. I highly doubt that Notch programmed his own renderer from scratch.
So it's like: the idea of the game = code of the game in Java, which imports and makes use of other technologies created by other people/companies. Which is what most game devs do, even in AAA.
it has a situational use, if you need to dump your data into a file that you can easily modify or open to modification later. Plus, it reduces clutter. Plus, it can be split into multiple files. Versatile, yet organized.
>I don't know that. But I'd assume, that Minecraft just makes use of the other existing engines that are free to use.
Now I'm confused, are you not
?
I am. And the logic follows. I don't know for sure, esp. now, with over9000 Minecraft versions. But the original Java version just made use of other stuff.
>Minecraft has not a single engine in it.
followed later by
>Minecraft just makes use of the other existing engines
seems contradictory to me?
you were talking about a game archive, not putting data in a data.lib ? like a WAD file in DOOM/Half-life?
Very confusing to use the "data library" term just when people are talking about libraries like lwjgl.
Game archives are terrible for binary differential updates, and that's the main reason why game updates are so huge and slow nowadays.
Don't reinvent the file system; absolutely avoid them.
Why would you put data in a library when you could just load a binary file at run time?
And you are the one saying "terminology is always important", geez anon.
>Why would you put data in a library when you could just load a binary file at run time?
NTA but it makes it easier to reuse and redistribute code.
I'm pretty sure they're referring to something like a DLL or SO file.
>I'm pretty sure they're referring to something like a DLL
yes.
>put your data in a dll
No wonder the memory footprint of most apps is >500MB
>Minecraft has not a single engine in it.
>Minecraft uses lwjgl which is a 3D Java engine.
who is the amateur here?
>lwjgl
Lightweight Java Game Library is NOT a fricking engine. The clue is even in the fricking name: LIBRARY. mfg, I can't even. No wonder software sucks donkey balls so hard.
Uh anon the developers themselves have talked about minecraft's "lighting engine", referring to it as such. What the frick are you calling it?
Yes, minecraft has its own lighting engine, and as that anon pointed out it has been rewritten often.
Minecraft has a marketplace store with DLCs and cosmetics, but the main game is crap. They know their priorities.
I don't care what Bedrock Edition is doing.
A lot of Notch's code was rewritten though, since many of the original features he added had to be updated or changed.
But they did, and no one uses it: Bedrock.
But mostly because it's predatory-eshop-ridden garbage.
Had the big mods migrated there to force people to play, and made mods to swat away the microshaft bullshit, it would be a great platform; you could even run modpacks without allocating 20GB of RAM.
>The Day Before I Knew Programming
what am i looking at
skibidi yes yes
Blame diversity hires and pajeets.
Factorio
>7 years to code a 2d game that is specialized in pretty much one single thing
Still has room for optimizations, actually.
Idiot trying to make a game using Godot here:
I have a menu in a box attached to a node that's a child of my main node. So I'm passing a signal up from the menu, to the node to the game logic. Am I being moronic or does doing it this way make sense?
take a screenshot of your node and code, then state your problem properly
Am phone posting so no screen caps. Code is basically just
func [function name]():
    emit_signal([signal name])
Three times.
You can have a node at the top of the hierarchy that is completely separated from the visual stuff, you could call it a UIManager or something, with a script that declares and binds all the other (sub-)nodes & signals with all the callback code.
My point is your script can be completely separated from the (visual) node hierarchy.
Having managers is also good practice with other engines such as Unity.
Passing signals up the tree like that is not really ideal.
Use an Event bus singleton.
UIEvents.something_happened.emit()
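For anons unfamiliar with the pattern: an event bus is just a globally accessible singleton that both emitter and listener talk to, so the menu never needs a path to the game logic. Sketched here in plain Java for illustration (in Godot you'd make UIEvents an autoload script instead; all names are made up):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Event bus singleton: instead of relaying signals up through the node
// tree, both sides talk to one globally accessible object.
public class UiEvents {
    private static final UiEvents INSTANCE = new UiEvents();
    public static UiEvents get() { return INSTANCE; }

    private final List<Consumer<String>> listeners = new ArrayList<>();

    // Game logic subscribes once, with no reference to the menu node.
    public void connect(Consumer<String> listener) { listeners.add(listener); }

    // Deeply nested UI emits without knowing who listens
    // (the Godot equivalent of UIEvents.something_happened.emit()).
    public void emit(String event) {
        for (Consumer<String> l : listeners) l.accept(event);
    }

    public static void main(String[] args) {
        UiEvents.get().connect(ev -> System.out.println("game logic got: " + ev));
        UiEvents.get().emit("menu_closed");  // prints: game logic got: menu_closed
    }
}
```

The design win is decoupling: the menu and the game logic only share the bus, so moving either one around the scene tree breaks nothing.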
Thanks, will look into changing that tomorrow.
Just looked up a tutorial and this was exactly what I was looking for. Thanks again, anon.
Honestly, don't be too paranoid about coding practices in hobby or learning projects, since the entire point of them is to teach you the basics by trial and error. Once you have figured out how all of the individual parts you're using work, you can start diving deeper, until you reach a point where you are no longer struggling with just getting shit to work the way you want it to.
Yeah, I know my code is shit, but at this point I don't really know why my code is shit yet. Since this was one place where I knew I was likely doing something wrong, and there are knowledgeable anons here, I figured I may as well ask about it.
Signal up, function call down
You people won't want to hear this but it's because game prices are not keeping up with labour costs
>game prices are not keeping up with labour costs
>makes 40 millions a month
this is a thread about videogames, not jpg slot machines for mindless cattle
every modern AAA game has integrated some slot machine elements
they really don't. there's a lot of AAA gaas and lootbox machines but it's not all of them
very close to all of them though, except many Elder's Cringe
maybe Elden Ring*
nah, you have the Ubisofts in the industry, but there are still a lot of games that don't require monetization after buying the game. At worst some of them have paid DLC and cosmetics, but it's not like you're "gambling" on getting those with lootboxes and trading money/farming battle passes for shit, and even then they are still complete packages where you can ignore these most of the time.
the neckbeard anime-loving network admin's pockets are pretty deep. And that cash sure as hell isn't going towards women, international travel, or gym memberships.
>open up jobs to brown people and women
>guys why are the games getting worse?????
don't forget the soiboys
It's fascinating how the publishers and dev studios that promote how "woke" they are the hardest, are also always notoriously terrible workplaces. Really makes you ponder.
>create laws that limit number of white males in any industry
>everything goes to shit
Dont google the story behind the 737 max
>not knowing the difference between tps lag and java memory dumps
Pot calling kettle black.
by the author of
>it works on my machine
shitty ass programmer club presents:
>between the frame drops it's pretty smooth
It's literally the fault of fans.
You were told not to consoom slop and vote with your wallet but you didn't. They kept lowering standards and you kept buying, letting them know it's okay. This shit is still going on, too.
This all happened because programming became more and more accessible.
In the early days (80s, early 90s), programming was for nerds only. There was no Stack Overflow to copy code bits from. There was no IntelliSense to babysit you as you code. You read your books and manuals to learn new things and memorized or wrote down algorithms you invented.
The 90s changed everything with the Internet and easier languages like C. The threshold was lowered from nerds to smart people. You no longer had to have a passion for programming to make it.
The 2000s lowered the threshold even more with high-level languages like Java and C# entering the fray. Now you just had to be above average intelligence as a programmer.
The 2010s were the turning point. Stack Overflow became a thing, along with lots of IDE extensions to make programming as mindless a task as possible. Any midwit could become a programmer from then on.
And now we're reaching another turning point with language model AIs. People say AI will replace programmers, but it's actually just the bar being lowered yet again. Now you no longer need to even really understand a programming language. You just need to know what prompts to give to the AI and how to fit the different code bits it gives you together. So anyone with a functional brain can do this job now.
Funny thing is, programmers were always on board with making their jobs easier. They gladly gave their knowledge away so it can be used against them.
Your post is very superficial, skipping languages and decades. Programming started way earlier than the 80s. I'd argue it started before the 20th century, but whatever.
>Funny thing is, programmers were always on board with making their jobs easier. They gladly gave their knowledge away so it can be used against them.
Not really, it depends on the level as you point out. I imagine the lowest level math nerd programmers are still making bank at chip producing companies.
Also, giving simple commands to the machine back in the 50s wasn't hard either. It was just tedious, and you had to pay a lot of attention NOT to put an extra/wrong symbol anywhere, because your entire code would go breasts up in that case. But things like assembly language, for example, aren't hard to understand at all, if you read them.
TL;DR: We're for the simplification, because it became too complicated in the first place. Too abstract, you can't always simply follow the logic anymore.
it's already long enough without diving into other languages or ancient history
that, and I'm one of the 2010s midwit programmers so I don't have first-hand experience of what went on in previous decades.
but my point still stands. more accessible profession = lower average quality professionals
I've been programming since I was a kid, starting with ActionScript on Flash and then working in Java and C++ as time went on. When I finished college during COVID I couldn't find shit for a job, despite having over 10 years' worth of coding knowledge and programs, while these companies brag about hiring some random fricks who know nothing about programming or computers or anything, and now they start going to shit. Their downfall is well deserved.
This thread reeks of nerd sweat. Im outta here to have sex in my yacht.
I'm a software developer. I make bank working for a bank (hehe) and work very comfortable hours from home. I would never work for a game company.
real programmers do real jobs like creating internal tools and flows for their corporation. game developers are just trannies who couldn't get a job
>oppinions on programming
>on Ganker
lmao
you have no idea how to program, every single employed dev is better than you
So, did anyone in this thread actually finish watching the video in the OP? In the end he installs fan-made mods that bring the FPS back up to 120 and remove any stuttering. This is not a Java problem, this is a Microsoft problem.
All these performance-optimizing mods make compromises with rendering quality/consistency. Yeah, I used to use OptiFine (nowadays I use Sodium+Lithium instead, as it performs better post-1.18 for me), but I often notice the discrepancies versus the vanilla renderer. 1.20+ performance is good enough on my machine that I don't need performance mods at the moment anyway.
>run minecraft on 2gb ram
>"woah this optimization really sucks"
>run minecraft on recommended 6+gb ram
>"woah based modders fixing minecrafterino!"
Minecraft devs deserve a lot of shit for how slow and small their updates are but the problem in this case lies between the keyboard and monitor.
this shit should run at 120 fps on 15-year-old dual-core hardware; rendering cubes is the easiest possible use case of GPU instancing.
most of them also frick around with other aspects, mostly the mods. I once installed a performance mod that really did give extra frames (even more than OptiFine), but it also made my character violently contort every time I tried to swing a sword.
WHO FRICKING CARES ABOUT GAMES BEING OPTIMIZED
THEY'RE FRICKING UGLY. COLOURS ARE SHIT, CONTRAST IS SHIT, REAL-TIME LIGHTING ALGORITHMS ARE DOGSHIT, EVERYTHING RUNS ON UNREAL OR UNITY. IT DOESN'T EVEN MATTER
STFU
Sorry that not every game is brown & bloom anymore, but you really need to get better taste.
Every single game looks like shit. There isn't one game with good lighting, without ugly physically based shading, not a single fricking one.
All there is are a very FEW games that are less shit than others.
It's not feasible to render accurate ray tracing in realtime. Even games that advertise themselves as having "ray tracing", or RTX, or path tracing (a less taxing method), are typically partial and noisy implementations since it is still too computationally taxing to run in realtime at higher fidelity, especially where global illumination is concerned. Modern computers still don't quite have the horsepower necessary.
anon could you give us the top 3 of "old" games that have the best lighting according to you?
every single game. They didn't have weird washed-out real-time lighting, they didn't oversaturate to compensate for the washed-out colours, they didn't have physically based lighting that makes everything look grey-white. Every fricking game.
look at this interior
?t=302
it's not fancy, but it's neither washed out nor popping, it's just easy on the eyes. Every fricking game was like this, and now all of them have either Unity shit where everything is white/greyish, or Unreal, where they take algorithms that were made for artificial environments with artificial lights and use them for natural settings with sun, sky and grass outside, etc.
I wish you had provided better examples, such as Mirror's Edge, Final Fantasy 13 or Plants vs. Zombies: Garden Warfare 2
terrible example; it looks like a DirectX demo from the 2000s. Are you trolling?
I'm not trolling at all; you're the one who is either blind or has no sense of colour at all.
>DirectX demo from the 2000s
yea, and it looked good; now the colours are destroyed by the shitty realtime lighting, and maybe artistic choices by noob devs
I actually agree with you anon, I just honestly don't understand why you would pick such an example instead of
Look at the lighting on the left of this
?t=2201
This is typically the kind of garbo Unreal lighting that used to pollute every game, but now they've found other ways to make things look like shit. They used to remove all contrast and colour, and now they've started compensating for their washed-out shit by making stuff look too popping, almost like there's no general lighting affecting some things, making them saturated or giving them a strange colour that makes them look weird and more popping, like the weird yellow-greenish stuff you'd find on the plants in this AC or Witcher 3
or this blue hue that makes it look like everything happens during the early morning
?t=900
Zoomers will never experience a time when developers did lighting by painting each vertex by hand.
?t=761
>PBR & global illumination Engine Programmers VS Level Artists
>Procedural Animation Programmers VS 3D Animators
This is what happened at the edge of two disciplines with opposing forces, and since they work in separate teams (and studios) at Ubisoft it was doomed to end up like this.
yeah instead of brown and bloom we get washed out teal and volumetric fog
>REAL TIME LIGHTING ALGORITHMS ARE DOGSHIT
i'm absolutely 100% with you on that one anon
That's part of the problem. When new games look worse than old ones you don't expect them to run worse too.
>complains about performance
>quad core cpu
you gotta be trolling; you need an octa-core to run a game made of fricking cubes?
More cores wouldn't make a difference with minecraft.
has pixelmon been updated lately? is it worth going back to this game to enjoy some comfy building and catching pocket monsters?
>genuinely asking because last time i played the inconsistent framerates and tick spikes gave me headaches
There's even a new contender, Cobblemon
It's because people are dumb. You don't have to be smart to be a programmer, yes that includes you, anons who read this and think it's difficult to learn. The hard part is being a GOOD programmer.
Indians are very bad at optimisation and compression, which is why every western game is 500GB and runs terribly on PC
>4 core (probably 2 cores 4 threads from the U family)
>some laptop gpu
>RAM at the lowest speed possible
Yeah, sure, it's the programmers
Half-life / CS were running at 120fps on CRT screen back in the early 2000s. what's your excuse?
Post speccy so we can laugh at your 10 years old multimedia laptop turdie.
You get lagspikes like that when the game has to load in new chunks because you have a shitty processor.
For Minecraft you need really good single threaded performance.
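The usual fix for chunk-load spikes, which is roughly what the performance mods do, is to move the expensive load off the main thread and only pick up the result once it's ready. A minimal sketch (loadChunk and requestChunk are made-up names; the loader is a stand-in for real generation/disk IO):

```java
import java.util.concurrent.CompletableFuture;

// Sketch of why chunk loading causes lag spikes, and the usual fix:
// do the work on a background thread and poll for the result instead
// of blocking the render loop on it.
public class AsyncChunks {
    // Stand-in for expensive chunk generation / disk load.
    public static String loadChunk(int x, int z) {
        return "chunk(" + x + "," + z + ")";
    }

    // Main thread asks for a chunk and keeps rendering; the future
    // is checked each frame until the data arrives.
    public static CompletableFuture<String> requestChunk(int x, int z) {
        return CompletableFuture.supplyAsync(() -> loadChunk(x, z));
    }

    public static void main(String[] args) {
        CompletableFuture<String> pending = requestChunk(3, -7);
        // ... render frames here without stalling on the load ...
        System.out.println(pending.join());  // prints: chunk(3,-7)
    }
}
```

The single-threaded-performance point still stands: the world tick itself is one thread, so only the loading, not the simulation, can be hidden this way.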
no one on Ganker uses laptop for gaming, just stfu already
Post specs
i do :3
$3000 for a mediocre sub-par gaming experience
I'm also trans btw
A total of 108 Playstation 1 games were running at 60fps by the way
>most mindblowing blazing hardware in world history
>can't even hold 60fps, "muh just use a performance mod"
>performance is 30 fps
I'm looking at you UBISOFT
>blazing fast*
>console
I mean, to be completely fair, you're going to be working with pretty limited hardware regardless of what the marketing says.
Sure that means they should just tone down the visuals further on performance mode.
>limited hardware
i want to slap your face with CPU pins
look out anon that's a CPU made for an LGA socket you're holding
>16GB VRAM shared between CPU/GPU
>Series S has a GPU clock on par with a 1080TI, with the series X just barely breaking over the average 1080TI OC
>8 core Zen2
I mean, it's not BAD, but it's pretty far off from top of the line.
>>16GB VRAM shared between CPU/GPU
shared VRAM is unironically far superior
>shared VRAM is unironically far superior
Not to the point where it makes up for a much more restrictive memory budget.
A mid range system now usually has 8-10 GBs of VRAM and 16-32 GBs of system memory.
Being able to access the same data on both the CPU and GPU can be a plus for visually demanding games with a lot of visual data that needs to be loaded onto the GPU, but for actual logical/graphical processing, it's basically useless.
Do you even hear yourself though?
Even in the worst-case scenario of a 100GB game, with 16 GB of VRAM that's 16% of the game loaded in RAM.
Don't try to "compression" me; not a single game developer compresses their textures nowadays.
So what kind of moronic programmer needs to load 16% of the game into VRAM at once? And that estimation includes de-facto streamed stuff like cinematic video and audio, which are never fully loaded anyway.
You don't need to load 16% of a 100GB game into VRAM at once.
Completely and utterly stupid.
>You don't need to load 16% of a 100GB game at once in VRAM.
reminds me of those games that weigh 4 GB and demand a minimum of 16 GB of RAM and a recommended 32
Well, compression still exists.
>compression still exists.
i doubt it, as they're always some pixel/voxel-based indie jank with horrible textures and an abuse of prefab Unity/Unreal effects and filters
Unity compresses things by default. I'm guessing a lot of jank does not change defaults.
There are also a lot of other things in the RAM other than assets.
latest example
https://store.steampowered.com/app/1159290/The_Bloodline/
>falling so easily for the marketing israelites
The 1080TI is a dated card that is nearing the end of its lifespan due to evolving graphical technology and its growing inability to support certain API calls.
t. 1080 TI owner
>he fell for the Vulkan and Ray Tracing meme
there is a shortage of graphics cards, but there is no shortage of gullible consumers
what did the totally not an amd owner mean by this?
>i bought the rx 580 in 2019 and then the 1080 in 2023 and don't regret my snail's-pace upgrading because EVERYTHING fricking supports the 1080
it's limited, but you're working for one single specification AND you have the resources to know enough of the internals of the hardware to optimize for it. it's genuinely mindblowing how bad the industry is right now
I feel like this is sort of schizo, but I feel like Mojang is intentionally killing Java's performance so they can wean players off it and onto the Bedrock edition, aka the P.E. Microsoft uses this tactic quite a lot.
I've been saying this for 4 years. The 1.13 Flattening (the most shilled update) and other "improvements" really ended up raping performance, so much so that servers would not update to 1.13.
that's not really true. there are definitely some turbo autists left. mojang is more of a joke developer you know.
Java is a dogshit language that the earlier it gets phased out the better
Too bad India and corporate thinks otherwise
>there are no good programmers because fricking Minecraft is poorly coded
Are you forgetting how it was such an unfixable garbage fire they remade the entire thing? It's not their fault that nobody plays Bedrock. Well, it is Microsoft's, but not the programmers.
white men have been losing their jobs non-stop for the last decade and been replaced by brownoids and women
>In fact most good programmers in all domains have left the software industry.
that's only half true, the "good programmers" are working 2 days a week for $200k + stock and doing woodworking/gardening most of the time.
the issue is too many people want to work in the game industry and people who are actually best suited can make more money elsewhere. so you're left with overworked, underpaid, under-qualified workforce.
>Come to think of it even CGIs from recent movies are absolute dog shit
this is more of an issue with contracting and the nature of the industry. basically a bunch of contractors bid on price to make CGI for a movie, the lowest bidder wins, and it's usually some shithouse company. the studio doesn't care cause they know most people won't notice/care about CGI quality. basically the bar is low because bottom of the barrel is fine for consumers.
So how do you become a "good programmer"? You make your own engine?
Learning about memory access optimization will put you above a lot of people:
https://en.algorithmica.org/hpc/
Studying the hardware you're coding for will make you optimize certain operations, and learning enough math to be able to apply algorithms from smarter people than you helps too. But in the end, there's no "good programmer", you're always learning.
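A minimal C sketch of what that memory-access material is about (a toy example of mine, not from the book): the same sum, with only the loop order changed, can differ severalfold in speed because one order walks memory contiguously and the other doesn't:

```c
#include <stddef.h>

#define N 1024

/* C stores 2D arrays row-major, so summing row-by-row follows memory
 * order: every access lands on the cache line you just fetched, and
 * the hardware prefetcher can see the pattern. */
float sum_row_major(float m[N][N]) {
    float total = 0.0f;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            total += m[i][j];   /* contiguous accesses */
    return total;
}

/* Same result, but each access jumps N floats ahead, so nearly every
 * read misses cache on a big matrix. */
float sum_col_major(float m[N][N]) {
    float total = 0.0f;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            total += m[i][j];   /* stride of N floats: cache-hostile */
    return total;
}
```

Both return the same number; only the access pattern differs, which is exactly the kind of thing a profiler will show you and the language itself won't.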
I KID YOU NOT MY DUDE I ALREADY HAD THAT LINK OPENED AT CHAPTER 2 WHEN I READ YOUR REPLY, AM I GONNA MAKE IT?
Obviously, yes.
Make a game with no engine, just add the libraries that you need and code the damn game.
Make a simple editor to place objects around, code a console, and a few command-line tools to bake the data.
You write actual code, think about whether it achieves what you're looking to do and what it actually does on the machine to get there, and don't get bogged down in outdated industry dogmas like OOP, TDD and other nonsense.
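A hedged sketch of the "what it actually does on the machine" point (all names here are made up): the same entity update laid out two ways, where the struct-of-arrays version means every cache line the position loop fetches is 100% position data instead of dragging hp and sprite ids along:

```c
#include <stddef.h>

#define MAX_ENTITIES 1024

/* Array-of-structs: a position-only update still pulls hp/sprite_id
 * through the cache for nothing. */
typedef struct { float x, y; int hp; int sprite_id; } EntityAoS;

/* Struct-of-arrays: the position update streams through two tight arrays. */
typedef struct {
    float x[MAX_ENTITIES];
    float y[MAX_ENTITIES];
    int   hp[MAX_ENTITIES];
    int   sprite_id[MAX_ENTITIES];
} EntitiesSoA;

void move_all(EntitiesSoA *e, size_t count, float dx, float dy) {
    for (size_t i = 0; i < count; i++) {
        e->x[i] += dx;
        e->y[i] += dy;
    }
}
```

Neither layout is "the" right one; the point is that you only pick well if you know which fields each system actually touches.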
generally just learn programming for a while
learning maths,
learn why/how the inner workings of the processor/memory work the way they do
from there you basically just read books/blogs from other good programmers and think critically about what tradeoffs they're talking about.
>learning maths,
does this include calculus, combinatorics, algebra (equivalence classes and relations and other basic shit), graph theory, etc. that you cover in college?
I know people who will tell you to do everything, but I basically just did a udemy course in calc.
Mostly you just want a foundation in everything (if you're starting from 0), because you'll end up wanting to write some code that does a specific un-google-able thing, and you'll know what it has to do, but without the math you won't understand how to go about it.
Learn data structures
Learn C. Learn how to avoid the most common bugs in C and why they happen. You are now a good programmer.
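One classic example of the kind of C bug that post means — writing past a fixed buffer — plus the idiomatic fix (the function name here is made up for illustration):

```c
#include <stdio.h>
#include <string.h>

/* Classic C bug: strcpy(dst, src) into a fixed buffer writes past the
 * end of dst whenever src is too long. That's undefined behavior and
 * historically the source of countless crashes and exploits.
 * The fix: a bounded copy that always NUL-terminates. */
void copy_name(char *dst, size_t dst_size, const char *src) {
    /* snprintf never writes more than dst_size bytes and always
     * terminates the result, so the overflow can't happen; worst
     * case the string is silently truncated. */
    snprintf(dst, dst_size, "%s", src);
}
```

Understanding *why* the broken version fails (C gives you raw memory and no bounds checks) is the "why they happen" part.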
>programming and CS used to be work for nerds and enthusiast who were passionate about computers
>programming now is a career path for every NPC cause of good money
go figure
>go figure
NOOOOOOOOOO BUT MUH MERITOCRACY
Games became more and more complex, so now there is a split between engine creators and game makers. If you want good programmers, you will have to look on the engine side. Game devs are merely playing lego with code bricks nowadays.
>became more and more complex
games are braindead compared to how they used to be
Where would one learn how to program games "the right way"? Is it really just reading through the entire Doom and Quake codebases by Carmack?
Learn how to code for a case where good practices are mandatory, not an afterthought. The medicine or finance sectors, for example.
>Medicine
>good code practices
I think he means the equipment rather than the sites and forms
therac
Carmack's game code is one good start. (Sanglard's game code review also helps)
Handmade Hero isn't a bad start either.
Reading blogs by people who actually care like the Handmade crowd. (probably the only way you'll get a handle on actually good C code and allocators without drowning in decades of trash)
But really, as long as you're not buying into industry shit like "modern" and "standard library" anything, you're already ahead of the curve.
Curious about this. What about the recent old game decompilations and reverse-engineered codebases? Like OoT,
Anon, no, most of those old games were held together with duct tape. Look at what Kaze has been able to do just by reverse engineering Mario 64; those games weren't optimized at all.
No, I mean taking a look at the fan-made ports from the reverse-engineered code.
>those games weren't optimized at all.
And yet they are still way ahead of the absolute trash we get today even from top AAA in terms of stuttering, freezes, lag, popping, clipping and whatnot, even while today's devs use already-made engines like Unreal or Unity.
No fricking excuse.
And Mario 64 was made in under two years, at a time when 3d was a brand new thing with no engines or tools whatsoever available for the N64.
>And yet they are still way ahead of the absolute trash we get today
-skill floor was a lot higher
because
-tight hardware limitations meant they had to keep things sane
-way smaller teams and simpler technology kept things simple (no 8K models with a gorillion polygons and particle effect vomit on the screen, etc.)
-way more limited scope
-no diversity hires who gamed their way through interviews
-etc.
Decompilations are someone else's guesswork of the code.
It's usually not well-laid out or easily understood and may even include auto-generated stuff.
But YMMV.
I reckon blogs should only be read by people with enough understanding to disagree with them, too many juniors read a blog and assume it's gospel.
unless you have someone in your life who can point you at good ones.
Play a few thousand hours of Factorio and you'll have a far better understanding of what makes a program run well than someone fresh out of formal education.
>Eat a few thousand times at a one-star restaurant and you'll have a far better understanding of what makes a 3-star chef cook well than someone fresh out of formal education
>Hey this one smart little change can make the whole system way more efficient rather than doing all this dumb shit I was doing before
THIS is what's missing from nuProgramming.
>been doing front end an hour a day
>it’s not enough
I’ve no passion to spend any more time on it.. can’t I just play vidya for a living?
>thread devolves into mustardrace homosexuals inadvertently defending how fricking terrible mojang is making their fricking baby block game run
Never change, Ganker.
>Learn C
>Learn code good
>Nobody is hiring for C
>It's all cloud shit running on dogshit tech stacks split into a million microservices using python or javascript as the backend language
See? Yet another casualty proving me right
the skills are easily transferable, language is the least important thing to learn.
>Nobody is hiring for C
All of embedded and plenty of C++ jobs overlap.
at the end of the day people went for convenience over quality
dont know what it's like where you live, but here you probably won't even get through the automated CV screen unless you have several years of professional work experience with the language you are good at. it doesn't matter how "good" you are at it; that's something HR can't measure, whereas they can confidently think "10 years working at Shitsoft? this guy must be a 10 time C genius!" even if all you did there was bring people coffee
>UMM, AKSHULLY, MY SIXTEEN CORE 2090TI BATTLESTATION PLAYS THE GAME AT 78 GIGASHITS PER MEGAFART WHICH AMOUNTS TO A PRETTY SMOOTH 60FPS ON THIS PIXEL GAME FOR TODDLERS
Didn't Ganker get buttblasted when Todd told them they need better PCs? What the frick happened? I guess minecraft is the end all be all of hardware demanding games now.
who are you even quoting, are you AI bumping the thread?
If a few mods can make the game's performance miles better why doesn't Mojang just buy those mods from the creators and throw them in the official game release?
>Hey troon, give you 100k for that mod of yours
>Damn, sure
And boom, the game runs better with zero in-house work required.
They tried this with the optifine guy, but he refused.
Mojang has actually hired plenty of modders, but you really wouldn't be able to tell because very little ever comes of it.
Once you're in mojang, you become mojang. You might have pushed out mods that double the games content and add mechanical depth reliably beforehand, but now you'll be spending a year to add a trans bee and remove the Ganker splashscreen.
The end result of the optifine guy joining mojang would've been the game still running like shit, but now optifine would no longer be released.
>give you 100k for that mod of yours
in minecraft*
I dropped out of my CS graduate course.
It really is a travesty that bedrock is basically just the microsoft approved babyfied microtransaction simulator. It actually runs great, has better render distances than most mods that increase them (by a wide margin), and you never see chunks load in, since they blend in organically.
>It really is a travesty that bedrock is basically just the microsoft approved babyfied microtransaction simulator
their game, their tos, their rules. don't like it? make your own
not him but
>we changed the rules so you cannot own personal items anymore
>their state, their laws, their country
>dont like it?
>make your own
tl;dr
you're a cuck.
>noooooooooo I can't believe the corporation did something completely within its rights. this is hecking robbery!
the game is not yours. cope gamer manchild. make your own toys next time
>corporations monetizing every aspect of a game is fine bro, just make your own
what a pathetic b***h
you twist and bend to make a pointless argument you made up to justify "your stance" here, even though we can tell it's completely fabricated
you're a dumbass and a cuck for pretending to like the taste of corporate boot
stained_souls, featuring (You)
homosexual 🙂
Informative thread. I have no idea what any of it means, but good thing there's people that do
yea
>They were replaced by an army of turds who became developers because their parents told them to and have zero dedication for the craft and software is the worst it has ever been.
well, i guess i'll be another one on that pile.
inb4
>just do what you love!
i'd rather be a writer, so either get me a shotgun to shoot myself with or stay seething
i don't know jackshit about programming but i think devs should just make some good fricking games
Minecraft is a terrible example because mojang has been made of completely incompetent people since its inception. The fact that they never rewrote the game, even before MS, in something that is not Java after all these years and money speaks volumes.
Be the change (You) would like to see, then. You can do it Anon! I really do have faith in you.
You could become a great gamemaker /programmer if you tried!
If you can't make a good board game, you don't stand a chance of making a good video game. Go get a deck of playing cards and make a game out of them.
Because you don't need good game programmers in the game industry.
You've got so many crutches that you can make games without writing a single line of code. Hell, there's nothing impressive about code, unlike with art and music and writing and everything else. The programmer's job is to do the bare minimum.
>Hell, there's nothing impressive about code
float Q_rsqrt( float number )
{
long i;
float x2, y;
const float threehalfs = 1.5F;
x2 = number * 0.5F;
y = number;
i = * ( long * ) &y; // evil floating point bit level hacking
i = 0x5f3759df - ( i >> 1 ); // what the frick?
y = * ( float * ) &i;
y = y * ( threehalfs - ( x2 * y * y ) ); // 1st iteration
// y = y * ( threehalfs - ( x2 * y * y ) ); // 2nd iteration, this can be removed
return y;
}
Yeah yeah, we're not in 1986 anymore grandpa
>"wow this scenery is beatiful, who is the artist?"
>"wow this music is banging, who is the composer?"
>"wow this character's story is heart-wrenching, who is the writer?"
>"wow this box-pushing puzzle runs so smoothly, who is the programmer?"
one of these is not like the others
>>"wow this box-pushing puzzle runs so smoothly, who is the programmer?"
the answer is always, ALWAYS jon blow.
I have unironically looked up and read blogs of developers for games that I deemed technically impressive, like factorio, teardown or deep rock
If anyone looking for the source of this, search for "fast inverse square root" in google, it's from quake source code.
Don't use it though, it's obsolete.
specifically the "what the frick?" comment is from john carmack, which is like some soundcloud rapper getting mick jagger to shit his pants in awe
Kaze put it in Mario 64 like a month ago.
makes sense given that the N64 was released in 1996
i gave up learning programming when i saw chatGPT could write code in 10s in any programming language that took me weeks to learn. it might not be good yet, but it will definitely kill 90% of code monkey jobs in the next decade or so
Pretty much and it's already killing my existing job. I'll probably just exhaust my savings and blow my brains out once jobs evaporate.
OOP(oop) was a mistake.
C++ is OOP and it's arguably the fastest language for games.
The problem with OOP is that it abstracts and simplifies things too much. So now you have 30iq smooth brains who have no idea what their code is actually doing, inadvertently making spaghetti code, or code that is simply not optimized. And no, telling the compiler to magic away your shit-stained code is not going to make your code truly optimized.
OOP should only be used to prototype algorithms, not in release code.
>Look at Rust web framework sample with Attributes and Extractor black magic.
>Code is so abstract no one can tell what the assembly looks like
Might as well create a DSL...
Rust is the Chat-GPT of programming language, literal black box, also slow.
There's bigger issues (though abstraction dogma will frick up the legibility of any codebase, if you're not running it on a server and I still can't step through it in a debugger it's genuinely moronic) like how C++ OOP introduces endless "features" to solve problems C++ OOP introduced in the first place.
Rust can be written ok (you wouldn't think it, but it is actually pretty OOP-hostile at its core despite crippling itself with C++ "features" like zero cost abstractions), but the language itself is unergonomic as all hell and gets in the way when you're trying to get things done. I'm not talking about the borrow checker here but shit like refcounting (due to the aforementioned zca's) and hiding 99% of features behind +nightly for years on end.
The culture of the lang is also pretty damn bad, with a ton of people just pulling in crates like it's NPM. So if you pull in one library (like say a HAL for your embedded platform), those 1-2 libs you add will pull in a few crates of their own, which pull in dependencies of their own, and then you suddenly have 90+ crates compiling for your hello-world, ballooning your compile time.
C++ doesn't force OOP though, it's optional and can be used only where it's really needed, or not used at all
>C++ doesn't force OOP though
I'm sure in a team of dozen programmers no one will pull a Bertrand Meyer on me.
>Bertrand Meyer
QRD?
An evangelist who wrote multiple books about OOP.
This thread is underage b& and OP has never worked in the software industry before.
OP here, there is no such thing as the software industry, it's the called coding interview industry.
Once done you can code all the garbage you want.
Rain World
>a 2d game made with unity
>Uses minecraft Java as an example
moronic
Many such cases
I've heard all my life about these "god developers" but where are they? who are they? what have they done? I only ever hear millenials fluff about john carmack and mark cerny.
imo it's not so much individual god developers, but rather teams of people who were willing to put in the time and effort to really bring out the best in each other's work. something i've noticed about working with other programmers in game dev is that everyone kinda sucks at something, and everyone has something they're really good at. so you have a guy who can write incredibly awesome looking shaders, but can't really put together bigger architectures. and you might have a guy who is great at architecture but is completely shit at working with quaternion and vector math. and a third guy who is not really good at anything except looking at other people's code and finding bugs and optimizing it.
when you put 10 of these people on the same team and have them all be friends, they're going to make awesome shit. the end consumer is going to think that whoever made the water physics & graphics is a genius, when in reality it was actually 3-5 morons combining the non-moronic parts of their brain for it.
nowadays, game dev companies have a bit too much red tape and bureaucracy to have this kind of development to really happen. everything goes through tickets and management, and generally, someone taking the time to root around in someone else's code and offer improvements is going to be seen as time wasting and toxic. it's not always like this, but it often is.
This, corporate culture is cancer. Also devs spend too much time optimizing their Jira points rather than doing actual work nowadays, cause points is all the upper management looks at.
>cause points is all the upper management looks at.
let's not forget the important metric of "how often the team's icon turns yellow when the manager is looking at it"
but yea, 90s and 00s game dev felt so innovative because it was a bunch of nerds cowboy coding with no management or suits to please. it wasn't perfect, as it also made a lot of projects waste time and money because of endless prototyping and pivoting (see: halo and tf2). but i think it's the main factor in why everything today feels so dry and soulless in comparison. everything is done waterfall style according to a game design document, and that game design document is based on some previous entry in the series or genre. you end up with more consistent development, but there's nothing to compel either the developers or the players to discover something new in games.
>that post
vid fricking related, replace music with video games
Regular software has suffered greatly from this. The computers I use at work have about 8GB of RAM and struggle trying to run Windows 11, edge and a cloud based inventory software. I have gone out of my way to clear the systems of bloatware too. 8GB would've been a supercomputer to do the exact same shit 20 years ago.
yeah you pretty much need at least 16 GB to have a normal windows experience now.
I have an old one on w8.1 with 6 GB ram; it rarely gets over 3 GB even during heavy browsing. It only has like 30 processes running. I really miss how stable and performant this thing was.
>...
>10 seconds on an SSD
HOLY CANCER
that video was an ad by the way, not some random yt
forsen
programming won't exist anymore in 5 years anymore.
AI will be able to do it much better, much faster and for much cheaper than humans.
god I wish
>AI will be able to do it much better
>Trained on stack exchange data
If you are a good programmer you would be a moron to work on vidoegames, long hours, lower pay.
you're right but I could see someone being interested in it for the challenge, if they're really passionate about their job
programming under technical constraints is an art in itself
For 95% of people that shit goes out the window by your late 20s, because who the frick wants to keep working like that for shitty pay? Unless you are someone like Carmack, but even he has his own business and is getting leadership positions.
friendly reminder the good programmers never bothered with making games
they were always fricking garbage, hardware was just cheap enough to brute force through their moronation
>Be regular software engineer
>make 3-10x as much as a 'game dev'
>get company stock for lower taxes
>have good work life balance
vs
>game """dev"""
>make 5 figures
>lots of overtime
>but at least you're working on video games!
No sane person who gets a CS degree would want to work as a video game programmer
No sane person gets a CS degree
Why?
It's presently worthless
Proofs?
By the time you graduate, even if your unicollege did teach you useful shit (which is unlikely), most of it will be obsolete
job market is a little fricked for mid and senior level workers, and completely fricked for juniors and entry level developers. there isn't much growth in the tech sector so companies generally aren't hiring that much, and the jobs that are hiring get spammed with a ton of bullshit applications from third worlders that have no actual chance to get hired. companies can't cope so they either pick applications at random, use automation to pick one or they just hire through nepotism or headhunt to save themselves the trouble (preferred method)
I have 5 years experience and I've been unemployed for a fricking year at this point because I can't even get my foot in the door to all the goddamned ranjeets SAARing their way through every interview. For example, I had a company tell me they were interested in me about 3 months ago and they still haven't gotten back to me with a time for an interview but insist that I am both not out of the hiring pool and going to be interviewing with them soon. It's incredibly obvious they are just running through every single non-white, especially the Indians, and it's taking them fricking months and they're not making a hire. Can't fricking stand this country anymore.
yeah i had been in the same boat for a long while. i did get employed eventually but i discovered that you basically have to look for obscure, preferably local hybrid jobs that won't have the same deluge of application spam. linkedin and other big job sites are a massive waste of time and will only depress you further. i won't pretend this will solve your problem but you might be best off sending open applications to smaller businesses that aren't explicitly hiring and saying "i know my shit and will reduce your workload for cheap" and then hope to job hop to something else a few months later
lmao no.
I don't value the degree but I can tell you that credentialism is uber-strong in certain sectors that love hiring junior employees like banking.
Think thousands of jobs all of which are hidden behind a HR harpy requiring at least a bachelors in CS.
Why did you quote me?
>I don't value the degree but
>make one indie game
>rake in 9 million
>2 years of work
planning my retirement at 30 is a treat.
>i-i-in the past all games programmers were like carmack!
it's hilarious to see people genuinely believe this when the reality is far more grim
he has always been the exception, most of your favorite old games were made by morons who had no fricking clue what they were doing, but did it anyways out of desperation and failing at everything else in life
You can either bust your ass learning C++ and Unreal Engine only to get replaced by a colored hair freak and a 50k salary if you get hired, or you can learn some easy as shit meme JS framework and get six figures.
So how come Dyson Sphere Program which is developed by the chinese runs so well
it's a unity game
Amateur indie programmers I can understand, but I still won't buy their half-baked games unless they're well-optimized.
As entertaining as Lethal Company's gameplay is, there is objectively no visible justification for the system requirements that it currently has, for example. It looks like Deep Rock Galactic with smoothed-out polygons.
minecraft is unfinished dogshit, Dragon Quest builders is 1000% a better game.
>*blocks your path*
How is that an example of a good programmer? It would be fricking sad if his game ran like shit.
>fixes bugs in a timely manner
>code is legible, allows external modding
>communicates with community
>still updating game to this day
>will run on literal potato
>cucked terraria's dev
>woman prefers the game that is more about running a household and having relationships
no shit
The problem isn't that there's no good programmers left, it's that there's too many programmers in total. It became known as the "easy" get rich quick job just like law so everyone and their mother wanted to become a programmer which meant we got a flood of low level programmers who could push down wages.
The truth is that if any good programmers have left it's because of their extremely homosexual desire for an exclusive club.
>extremely homosexual desire for an exclusive club.
rust
That's an exclusively homosexual club.
>oh you're here? we were waiting for you * slowly open door *
Minecraft is coded like shit. Everyone always knew it. idk if that's Java but Notch literally just took giant swaths of infiniminer's code, changed it around from C# to Java, and threw it in.
>infiniminer this infiniminer that
there were multiple games with that cube gimmick, including the Cube 2 engine and Cube World
Okay but he outright took algorithms from infiniminer. He was in the infiniminer threads playing infiniminer with Ganker before he made minecraft and even talked about modding infiniminer with mobs so it was more of a challenge
There is nothing special about adding and removing cubes; don't call that an "algorithm" ffs.
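He has a point that the core is trivial; a hypothetical sketch (names invented here) of block storage as a flat array, which is roughly what any voxel game does, with placement and removal being a single index computation and store:

```c
#include <stdint.h>
#include <string.h>

/* A 16x16x16 chunk stored as one flat array; block id 0 means air.
 * "Adding and removing cubes" is just writing a byte at an index. */
#define CHUNK 16

typedef struct {
    uint8_t blocks[CHUNK * CHUNK * CHUNK];
} Chunk;

/* Map a 3D coordinate inside the chunk to a flat array index. */
static size_t block_index(int x, int y, int z) {
    return (size_t)((y * CHUNK + z) * CHUNK + x);
}

void set_block(Chunk *c, int x, int y, int z, uint8_t id) {
    c->blocks[block_index(x, y, z)] = id;
}

uint8_t get_block(const Chunk *c, int x, int y, int z) {
    return c->blocks[block_index(x, y, z)];
}
```

The hard parts in a real voxel game are elsewhere: meshing visible faces, chunk streaming, lighting. The cube storage itself really is this boring.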
There's massive brain drain because the vast majority of the industry's profitability has been siphoned off by marketing types
When a game makes billions, the developers don't see any portion of that, it all goes to celebrity promos and stock market dividends.
Developers don't have any reason to excel because it's purely a labor position now. You get the bare minimum because they're only paid the bare minimum. That's why so many developers are abandoning the industry, because you can do the bare minimum in webdev and make twice as much money without sacrificing your personal time and health.
It's a very sad situation when your require less than a fraction of the skills for enterprise software but get paid like 10x more. The only high skill high wage positions are in AI.
>your require less than a fraction of the skills for enterprise software but get paid like 10x more
Look ESL-kun. It's simple. Society worships money. The more money involved, the more you get paid. Entertainment is a superfluous luxury, while enterprise software is a money-making tool.
What's so hard to understand?
I work on enterprise software dumbass. I know where the money is but it's still baffling such a low skill area is paid significantly more.
This isn't even remotely true. If anything, most game devs are overpaid and barely work 3-4 hours a day at most. When you see them whining about crunch, it's because they refused to do anything for months and not suddenly have to get things done.
These are just the senior producers. The actual programmers and developers routinely work 12 hour days.
https://www.dice.com/career-advice/game-developers-how-many-hours-per-week-do-they-work
You only hear from the senior devs, that 21% working a maximum 40 hour work weeks, because everyone else gets laid off at the end of a project. So their opinions at any one company do not matter.
This is a product of terrible management. Games spend most of their development lifetime in preproduction, actual production happens in a 1-2 year span.
Most games are terribly managed. Most producers are terrible managers kept afloat with money. Most devs are terrible at budgeting time as most CEOs of dev companies come in from places like Hollywood nowadays where the work required for any given change isn't proportional to the returns received, and most of them never know that.
it's not terrible management of companies, it's terrible mismanagement of funds and state laws; they don't need to make good games
>overpaid and barely work 3-4 hours a day at most
Median gamedev is ~$54k, median webdev is ~$80k.
I wouldn't do more than 3~4 hours a day for what they're paid either. You'd have to be completely lacking in self-worth or respect to give 100% at a job that's paying you below the line when other industries that utilize the same skillset are paying far more.
It comes down to the egregious bureaucracy of game dev companies nowadays, that coupled with incompetent executives and the “bottom line” is just disastrous. AAA games are failing more than they succeed currently, which is a scary realization.
who tf would work on video games as a java programmer. There's a lot money to be made in Enterprise Software, and as a bonus you don't have to work with troons
There is indeed not enough dedication to learning first principles, low level workings and how software runtime actually maps to hardware.
However, the true main issue of the industry is that manglement exists and stakeholders need to be pleased. Manglement will gargle investor wiener for money all day. They will churn through workers and their mental health if it makes their bottom line better in the short term.
There is no long-term planning whatsoever and engineers are seen as replaceable, which is far, far from the truth, which can be seen due to the state of many of the newer releases.
>There is no long-term planning whatsoever and engineers are seen as replaceable, which is far, far from the truth, which can be seen due to the state of many of the newer releases.
GPT proves this is the case though.
Anyone who thinks 'AI' is anywhere near being actually useful outside of creative endeavours or a danger to programmers or their jobs is missing a few braincells.
AI is still in baby mode; coders will be phased out or abstracted away to the level that kids will be able to code full games. Models, voice, logic, all of it. Some ways off, but we will easily see the dawn of AI in our lifetimes.
>OMG LLM CAN DO EVERYTHING
You don't understand what AI even is.
Neither do you. You think AI can think. It doesn't. It's nothing but a set of pre-programmed macros that pulls info with an algorithm and regurgitates what it's been "taught", aka programmed.
AI is a sophisticated macro, nothing more.
I highly doubt it. If you need to tweak the code the AI generates, that means you have to understand it.
If the AI is smart enough to generate/tweak everything, then everyone is out of a job.
LLMs are not AI in any case.
Generating the creative side of the slop is slightly possible I guess, but it will be highly obvious.
Why?
Because the current AI can't think.
>All the good game programmers have left the game industry.
At last they are doing some more necessary things for society
It's called the competency crisis and yes, it's being driven primarily by Women and Diversity Hires
How bad will GTA 6 run when its released?
windows 11
i9-12900K
64 GB of RAM
400GB install ssd required
3080ti for minimum settings 40 fps 720p (rescaled to 1080p)
the mega autismos work in fields no one wants to do like devops to get megabucks while working on side projects at home (usually something like decompilations of their old favorite game or enhancing an already decompiled game)
example: kaze emanuar
Aren't video game developers paid jack shit compared to any other industry? No wonder guys like Carmack fricked off to work on AI.
As a software engineer, the reason I don't work on video games is because I've never so much as heard of a recruiter trying to hire for video game development positions. In college no video game company would show up for career fairs. As an adult, none of the recruiters who call me or hassle me over linkedin are representing video game companies. I feel like, similar to art, video game programming positions are taken by people who really want to do them, which leaves potentially more competent people who don't care what industry they work in unavailable.
I was contacted by Ubisoft once just because I have a toy 3d engine repo in github.
makes sense, a friend of a friend works at ubi and out of 8 hours they have, only 2 or 3 are spent actually developing something, the rest they just dick around
Tbh, 3-4 hours of non-stop work is pretty exhausting.
thats literally every office job
humans are incapable of more
anyone who says you actually work for 8 hours a day in an office instead of half of that or less has never worked there in their entire life
you get to do some stuff but otherwise you can take a break whenever you want and deadlines aren't very strict unless you are dumb or extremely lazy
only some low level drone jobs have you work the entire time
>a friend of a friend works at ubi
not for long, i give it 3 months max before they let go thousands of people
it's not too late to gain an interest in low-level programming, is it?
No. I learned C++ 2 years ago by spamming CppCon Back to Basics talks, reading A Tour of C++ and making Space Invaders in GLFW. It has served me very well.
>C++
Ah, so you're from India.
No lmao. Computer Graphics is a thing moron.
Poos are too stupid for Cpp
It never is. It's very simple.
Naw. The best paying jobs all still involve C++ and Rust. In a way it's impossible to phase out since Moore's law is le dead.
Actual low level programming requires a lot of theory though. Don't think it's just learning assembly for x86 and ARM and a bunch of simple languages. You need math intuition to make it anywhere.
In my experience good programmers who want to get better can't pass a personality screening to get jobs in the industry and they don't last long if they do get in. Tech used to be hailed as a haven for skilled outcasts but it's just another white collar corporate job now
It's not about skill. It's not even about pay. I'm going to ignore the actual illiteracy of a generation brought up on abstractions for a moment (since millennials really are no better) and instead point towards the job they're required to do.
When you're building a Minecraft from scratch you test things. You learn the source inside out because you wrote it. You can imagine side effects and what where is giving you issues or what could be optimized so you're not wasting compute on worthless bullshit that could be achieved more economically.
When you're hired by Pajeetsoft to work on maintaining and expanding their Shitcraft with a whole host of issues and given code that already has shitstains and dried cum on it from the hundreds of people there in that position you're in right now, each gradually more incompetent than the last as the corporate ladder went its merry way, you stop giving a frick. Unoptimized implementations? Let the compiler take care of it, sirs. Side effects? That's a feature, sirs. Memory leaks? Unused dedicated wam is wasted wam, sirs. And so on. You hold on tight enough and stop giving a shit. Minecraft is an extreme case, but you see this in everything from in-house software for companies willing to pay much more attractively to mere engine work with Unreal, Unity &c.
Only your creations are worth the pearls of your intellect. Want better programmers? Stop buying goyslop.
yep. and its been that way for what feels to me like a decade already. nothing new to look forward to, just more always-online p2w cheap uninspired unfinished buggy crap riddled with sjw bs
whats with the aicels itt
you're always gonna get paid to baby the dumb ""AI"" lol
>Come to think of it even CGIs from recent movies are absolute dog shit
I heard people criticizing the latest Disney film. If they're right, then it's something quite interesting to note, because Disney isn't the kind of company that will cut costs on the quality of their 3D (they'll cut costs by, for example, migrating from 2D to 3D, but not by making the 3D visibly worse to save money).
The only explanation I can think of is that the talent working in the industry right now isn't as good. But it's hard to believe that when you can open any Blender tutorial on YouTube and see some guy single-handedly creating high quality models...
So it's a kind of mystery... there's something strange going on in the entertainment industry in general.
>So it's a kind of mystery... there's something strange going on in the entertainment industry in general.
Nepotism, diversity hires and grifters getting hired instead of actually skilled people.
HR departments are run by women who hate awkward nerdy guys (who are usually good programmers)
For any people currently learning programming and getting discouraged by the crabs in this bucket of a thread:
90% of all claims made by anons are, as usual, exaggerations, parroting, unwarranted elitism from people who think Ganker's opinions on anything matter, and straight-up lies.
Even the OP starts from a false premise: it doesn't take into account how the industry has grown over the last 20 years, ignores that there were badly performing games throughout those 20 years, and forgets that Minecraft was written in Java.
Keep learning and ignore doomers.
i just checked junior job offers in my area and there's almost nothing, and the requirements were super high. i really doubt i would ever get a call back if i decide to go the "self taught" programmer route
Where I live a lot of jobs got fricked over, so it's only Java programming jobs for software-related stuff. One thing I hate is how most companies will list "online only" jobs, but then as a requirement you have to live within 20 minutes of their office, which is just moronic and makes filtering searches by area so much more annoying, since they flood most of the results. The way most job search sites are set up is awful and feels debilitating on purpose: you waste time reading a whole listing only to learn you can't apply to it because it wasn't filtered out.
>One thing I hate is how most companies will list "online only" jobs, but then as a requirement you have to live within 20 minutes of their office
probably done so that they dont get spammed with people trying to get remote job from india and shit
why learn programming when you can just use unity or something and just buy assets you need :^)
be the change et cetera
Meanwhile in Japan, people are making full-blown Age of Empires, Doom, and other game clones in RPG Maker 2003. Let that sink in.
That isn't hard
Great, let's see your RPG Maker 2003 age of empires clone anon. I'll be waiting.
That's insane, but I think they use that mod for RPG Maker that gives you more freedom with variables, images, etc., as well as significantly expanding the limits (for example, instead of 20 images on the screen, you can manipulate thousands of them).
Basically, it's a less efficient way of putting "sprites" on the screen and programming their behavior using RPG Maker's variables and image-moving functions.
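A minimal sketch of what that style of "programming" looks like, modeled in Python. All the names here are hypothetical stand-ins; real RPG Maker 2003 exposes this through event commands (Control Variables, Show Picture, Move Picture), not an API:

```python
# Hypothetical model of RPG Maker-style eventing: all state lives in
# numbered variables, and all graphics are numbered "pictures".

class EventInterpreter:
    def __init__(self):
        self.variables = {}   # variable id -> integer value
        self.pictures = {}    # picture id -> (x, y) screen position

    def control_variable(self, var_id, value):
        # "Control Variables" event command: set variable N to a value
        self.variables[var_id] = value

    def show_picture(self, pic_id, x, y):
        # "Show Picture": place picture N at screen coordinates (x, y)
        self.pictures[pic_id] = (x, y)

    def move_picture(self, pic_id, dx, dy):
        # "Move Picture": shift an already shown picture
        x, y = self.pictures[pic_id]
        self.pictures[pic_id] = (x + dx, y + dy)

# A "unit" in an RTS clone is just a picture plus a few variables:
ev = EventInterpreter()
ev.control_variable(1, 100)   # variable 1 = the unit's hit points
ev.show_picture(20, 64, 48)   # picture 20 = the unit's sprite
ev.move_picture(20, 8, 0)     # walk one step to the right
print(ev.pictures[20])        # -> (72, 48)
```

Every game system (pathfinding, combat, UI) gets hand-rolled on top of these few primitives, which is why stock RPG Maker caps like 20 on-screen pictures are crippling and the mod's expanded limits matter so much.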
>Pixeleted indie game
>its unoptimized trash which grinds my GPU and CPU at 60% + no matter the scene
hate how common that is
Software dev pays about double what game dev does, with fewer demands. Both are kinda fricked in terms of job supply vs demand. Getting hired at a company has been pretty fricked in general since the pandemic; demand is nowhere near where it used to be.
Minecraft almost always had horrible performance. The only time it was even remotely good was back in the pre-alpha days when it was nothing but a building blocks simulator.
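To put numbers on why naive voxel rendering is so slow: here's a sketch of hidden-face culling, the first optimization any Minecraft-like needs (illustrative only; a real renderer also batches chunks into meshes, greedy-meshes faces, frustum-culls, etc.):

```python
# Hidden-face culling: only emit quads for block faces that touch air.
# Interior faces between two solid blocks can never be seen.

def visible_faces(solid):
    """solid: set of (x, y, z) filled blocks; returns faces worth drawing."""
    neighbors = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    faces = 0
    for (x, y, z) in solid:
        for (dx, dy, dz) in neighbors:
            if (x + dx, y + dy, z + dz) not in solid:
                faces += 1   # this face borders air, so it must be drawn
    return faces

# A completely filled 16x16x16 chunk:
n = 16
chunk = {(x, y, z) for x in range(n) for y in range(n) for z in range(n)}
naive = 6 * len(chunk)          # draw all 6 faces of every block
culled = visible_faces(chunk)   # only the outer shell survives
print(naive, culled)            # -> 24576 1536
```

For a solid chunk that's a 16x reduction in geometry before any clever tricks, which is why a voxel engine that skips even this baseline step "grinds your GPU" on a trivial scene.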
cope
OP here, if you deny something so obvious as software becoming trash then you are part of the problem
meant for
have you seen blueprints in unreal
it's not programmers writing the games anymore, it's designers making visual script travesties and then wondering why it runs like shit
all the programmers are doing is trying to keep things hanging together and playing daycare with the designers
Anyone claiming Minecraft was written by a good programmer to begin with is literally braindead. Notch's original code was using OpenGL 1.1
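For context on why OpenGL 1.1 is damning: it means immediate mode, where geometry is fed to the driver one function call per vertex, every frame, instead of being uploaded once to a GPU buffer (vertex buffer objects arrived in GL 1.5). A back-of-the-envelope model of the call counts, not real timings:

```python
# Rough model of API-call overhead: immediate mode (GL 1.1) vs. a
# buffered path. Call counts only; real costs also include driver
# validation and CPU->GPU transfer per frame.

def immediate_mode_calls(quads):
    # glBegin(GL_QUADS), then one glVertex call per vertex (4 per quad),
    # then glEnd -- repeated every single frame
    return 2 + quads * 4

def buffered_calls(batches):
    # geometry already resides in GPU buffers; one draw call per batch
    return batches

quads = 100_000                      # a modest voxel scene
print(immediate_mode_calls(quads))   # -> 400002 driver calls per frame
print(buffered_calls(1))             # -> 1 call per frame once uploaded
```

The per-vertex call overhead alone dominates the frame budget long before the GPU does any work, which is exactly the failure mode of early Minecraft-style renderers.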
making a fun game is harder and more important than good programming