WIP
Alright, hyper-realistic, yes... but only for LIVE rendering. Obviously, you can't do EVERYTHING on the older hardware, that's a given. But even today, pre-rendering (and internal caching) is used to reduce the load from rendering a graphic or set of graphics every frame to simply displaying them. This has limits, and it does sacrifice filesize, but compression and well-optimized file formats can mitigate that. My other complaint with the GPU is that it's inherently difficult to program, and all these wonderful things involving vectors and bitmaps have to be ENTIRELY re-designed for it. This forces devs to code their own stuff, or import a pile of libraries, putting them in major dependency hell and easily bloating the filesize of their programs just to include a few basic procedures. The core application can be a few megabytes; throw in your libraries and it can skyrocket to several times that, or even reach gigabyte sizes. That's less room for assets, and also increased load times. It can add hundreds to thousands of new places to encounter a serious bug, too. The GPU is nice, but it's a pain to use, and it still doesn't get used RIGHT half the time. (Same with RAM and CPU, and as stated earlier: computers only FIVE years old get rendered 'obsolete'.) The GPU and all modern-era hardware have to be treated differently, and too few people know how, or have access to the tools, to do it properly. (Or they're forced to meet unrealistic deadlines and/or are told how to do it by so-called "executives" and "government policies", the majority of which lead to HIGHLY unstable or controversial programs that neither the programmers nor the end-users agree with or endorse. But "rules are rules", right?)
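The pre-render-and-cache idea above can be sketched in plain JavaScript. (The names `renderExpensive` and `getCachedSprite` are made up for illustration, and the expensive draw is stubbed out; in a real canvas app the cached value would be an offscreen canvas you blit with `drawImage` each frame.)

```javascript
// Sketch of render caching: pay the expensive render cost once,
// then reuse the cached result every frame afterwards.
let renderCalls = 0;

// Stand-in for an expensive per-frame draw (e.g. rasterizing vector art).
function renderExpensive(spriteId) {
  renderCalls++;
  return `bitmap-of-${spriteId}`; // placeholder for real pixel data
}

const cache = new Map();

function getCachedSprite(spriteId) {
  if (!cache.has(spriteId)) {
    cache.set(spriteId, renderExpensive(spriteId)); // render exactly once
  }
  return cache.get(spriteId); // cheap lookup on every later frame
}

// Simulate 60 frames drawing the same sprite:
for (let frame = 0; frame < 60; frame++) {
  getCachedSprite("hero");
}
```

The trade-off mentioned in the text shows up here too: the cache costs memory (the stored "bitmap") in exchange for skipping 59 of the 60 renders.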
Which brings me to my next point, adding on to what was said about 4K. Indeed, it doesn't NEED to go there, but it WILL... I don't like DPI, either... Mobile devices are notorious for being an absolute pain in the ass with graphics display. Modern TVs are NOT recommended for older games because they lack a wonderful thing called "scaling" (and because their DPI combined with resolution makes it nigh impossible to see what's going on in your game). I don't think these hyper-HD models are needed, either. I would actually rather ship unfinished low-poly models, because the story was lacking and needed to be finished, than spend my remaining time making several-million-poly, 100MB-1GB-per-file models with 2048 uncompressed RGBA32 texture atlases and advanced shaders. Graphics are important, but should not be the main selling point of a game. I couldn't give two shits about the chameleon in the background or shiny reflective water (that are eating so many resources hundreds of trees had to be deleted). I care about gameplay and story. Graphics are nice, but make a game, not a freaking interactive movie. I don't care about COD and most of the other FPS and so-called "AAA" games, because they are overrated in my opinion. I wouldn't even pirate them, assuming I had a machine that could run them. I obviously wouldn't buy them, and I'd rather not make one. People are too obsessed with "HD graphics". Let's pretend you had a super-computer with infinite power and disk space. You could make ANY game up to what its hardware and/or OS are capable of. Even then, you shouldn't use the same bland routine of "what worked before". The revolutionary and now well-respected games, programs, and systems of the old days didn't get where they are by being the same as the last 10, 20, 100+ iterations.
They did something NEW, and not new as in "push the boundaries of known tech carelessly to the point where it was obsolete on arrival". When they DID challenge what was already there, or even in development, they used what they had better. Games made by the OLD Rare and Nintendo show this quite well. Even EA and MS could once be taken seriously. Now they're a laughing stock and are hated for RUINING everything they touch or create. Pixel art, low-poly, vector, voxel, etc... ALL have a place. HD surrealistic whatever is not the ONLY graphics style, FPS is not the only genre, pay-to-win is not the ONLY way to make an MMO financially stable, RTS doesn't mean "REAL (literally) time strategy", etc... One reason I now want to be a game dev is to try and bring back some of what made the gaming industry great. Show people a REAL game, something that came from the heart and soul, not another sequel of a sequel of a sequel of a sequel in a genre that's been done a million times over, with "better" graphics or "tons of unlockable content", that has "revolutionary cloud backups of all your saves", is currently in "early access", and is built on "THE BEST ENGINE IN THE WORLD EVER FOREVER". WTF...
Now, HTML5... A LOT has to be done. Given my great disinterest in touching the GPU, I won't touch WebGL, so I can't say anything of WebGL's features. However, the bulk of GPU-based libraries/sites, and the devs who encourage you to use the "wonders of the GPU", do not HAVE or PROVIDE the means to do things that software renderers have been doing for years. "Want a curve? It's easy, DO IT YOURSELF!" OK... class LineTool extends SomeGPUAPI {/*TONS of math goes here*/} ... or import gfxLib.* Now, the canvas 2D context... I am using this, and I even found a 3D renderer based on it (cool!). It does support all the basic vector-drawing stuff the default "hardware accelerated stuffz" REFUSES to implement, but it still has multiple caveats and DOES require rendering engines built on top of it to be incredibly useful. Depending on the browser, it has "security" features that make it difficult/impossible to test stuff on a local machine, or to distribute a stand-alone downloadable version of your application. (In fact, all of HTML5 does, especially when dealing with the "oh so incredibly progressive" browser that is Google Chrome.) Now, there are actually multiple support inconsistencies, most notably with CSS3: -webkit, -moz, -o, -ms, whatever... STANDARDS mean there should be only one, or a select few, methods to do something. These should generally be decided by a central organization (in this case, we have the so-called "W3C"). These vendor-specific tags/prefixes/functions are all decided by browser vendors, but nonetheless the W3C doesn't seem to care that this is a hassle for devs, as they endorse it by providing nearly all the documentation on how to do it. Result: what should take one line of code takes 4+ to (hopefully) target every browser. I won't accept the "HTML5 standard" as a "standard" until all these experimental features are universally supported, 100% documented, and not locked behind vendor prefixes.
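The prefix hassle is easy to demonstrate. Here's a hypothetical little helper (the function name and the mock objects are mine, not from any library) that probes for whichever prefixed variant of a property a given browser actually supports — exactly the kind of boilerplate a real standard would make unnecessary:

```javascript
// Hypothetical helper illustrating the vendor-prefix mess: probe a style
// object for the first supported (possibly prefixed) variant of a property.
function prefixedProperty(style, name) {
  const prefixes = ["", "webkit", "moz", "o", "ms"];
  const capitalized = name.charAt(0).toUpperCase() + name.slice(1);
  for (const prefix of prefixes) {
    const candidate = prefix ? prefix + capitalized : name;
    if (candidate in style) return candidate; // first hit wins
  }
  return null; // no variant supported at all
}

// In a browser you'd pass document.body.style; mock objects show the idea.
// A browser exposing only the webkit-prefixed variant:
const webkitOnlyStyle = { webkitTransform: "" };
prefixedProperty(webkitOnlyStyle, "transform"); // "webkitTransform"
```

So instead of writing `transform: rotate(45deg);` once, you either write every prefixed variant by hand or carry detection code like this around in every project.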
"HTML5" games should be called "HTML5 Canvas JavaSript" games, because all of the code in most cases is Javascript. They should not be called "modern" or "new" until JS becomes better structured and better optimized. All that really changed is some new security, syntax that supposedly can mitigate the use of hacky code, a few changes/additions to the existing APIS (like XHR2, which if named properly should be "GeneralURLRequest" , since it's designed for stuff like .txt .json and .bin , not exclusively XML) , the addition of some new APIS (HTML5 Canvas, HTML5 Video, CanvasContext2D, WebGL, etc...) , very small stuff. It's behind the EMACScript defs, even, which include more OOP stuff. JS uses hacks and VERY ugly code to emulate "proper" OOP If you Compare Flash AS3, Java, and C# (cuz, unity) there's a true OOP thing going, most especially JAVA. Modern games and programs use heavily OOP, and now are adding these Entity Component System things, which I gotta admit sound useful, but are presently beyond my grasp of understanding. Now, ignoring language features, Flash had an EXCELLENT system for animations. I think that flash should remain (or a system similar to it) to deal with animations. NO HTML5 library/engine I've seen to DATE can compare. JS is waaay older than flash...so I don't justify calling HTML5 really modern. I do welcome dropping XHTML, though. I could go on and on about how many reasons I do not like it, and even consider it's die-hard supporters hypocritical when saying how much "better" it is than the plugin-based systems that are FAR more developed. However, that would entirely de-rail the thread, if this hasn't been done already.
Due in part to the way I was raised, and in part to what I have learned growing up, especially in recent years, I see that EVERYTHING must have a balance. If you walk/look exclusively down one path, you inevitably make no real progress; you actually go backwards. You shouldn't be afraid to do something new, but you shouldn't forget what you or someone else has done before. In the right cases, you should bring an idea back from the past, or at least try to. The problem with the tech industry is that they either try too many new things, or take something old and by all means obsolete or not yet mature and re-brand it as something 'new'. They also like to re-use the SAME concept and not change it much, if at all (notably with games). They also don't balance out competing concerns (security/privacy vs. convenience/economics). As a result, they're going backwards. "Superior" GPUs can't do the same stuff that was originally given to CPUs to do, and the excuse for this is "do it yourself". Security is SO much more "important" that the END-USER is losing their RIGHT to have 100% control over WHAT is on their machine, WHEN it runs, and HOW it runs. I like Flash games/animations; I have yet to get a virus from one, and few have caused me lag (those that do typically use HD assets or poor resource management). Now, all the games/animations/sites that I love are in jeopardy over a global Flash blacklist in every web browser and mobile device, and a large corporate-sponsored effort to snuff Flash out. I should be allowed to opt out of this B.S. I accept the potential risk of a virus. I'm sorry, but Google, Mozilla, Apple, Microsoft: IT'S MY COMPUTER, STFU AND LET ME INSTALL THE PROGRAMS I WANT! Oh, and quit saying "HTML5 is more open than Flash", because nearly every prominent (and well-promoted) engine/tool for it is proprietary.
The closest thing to replacing Flash is CreateJS. But with a similarly super-complex API and no engine/IDE for it? It's just not NEARLY ready to replace Flash.
Now, textwalls...
On the site I was on before, people BOAST about how a post is "tl;dr" for them...
I also deal with this on YouTube. A lot. And what REALLY grinds my gears... when I shorten it (and I've witnessed other comment chains with similar exchanges), people will complain "not enough detail". Thus a vicious cycle of wasted attempts at making an "optimal" post/thread begins. Of course, there are plenty of worse comments to be made about people writing posts as long as mine. I have dealt with those, too. Like the general assumption that everyone writing such a long text wall is autistic... >.> (Which is ironic, because some people matching said description would be totally incapable of knowing HOW to write something so expansive and detailed, as some have SERIOUS problems listening to teachers telling them to EXPLAIN what they're trying to convey. Some are also so unaware of the world around them that they lack anything to contribute to the content of such lengthy and meticulously crafted posts.)
I think that's everything.
Hopefully my next replies if any will be shorter.
People saying textwalls take TOO long to READ should try writing one. It takes AGES. This one took me a couple of hours... It should take 10-30 minutes to read, roughly, and the most time may be spent replying, thinking it over, or researching ways to reinforce/challenge these statements... XD (Why do we humans like spending exponentially more time/effort creating something than it takes for it to be appreciated/consumed?)