Technology: been there, done that.

vibrokatana

New Member
Recently I have been bored out of my mind with computers, particularly in the hardware field. There is nothing massively interesting besides companies trying to throw out slightly better trash for consumers to scoop up like dogs from the floor. The entire market is dominated by x86 processors, which carry a bunch of legacy crud. The Cell, which was supposed to be an innovation in processor design, is only available cheaply in the PS3, which is an incredibly restricted platform. The software market can barely keep up with the advances in hardware, resulting in laziness and cut corners.

I feel that we are coming to an impasse where something will break and change. The market now is stagnant, IMO; it isn't changing dynamically to meet the needs of consumers. Instead they pump out the same old stock and attempt to crush any competitors with soothsaying and false advertising.

It just doesn't feel right to me. Anyone else?
 
Computers (and technology as a whole) have not done something truly innovative in over 10 years at least. However, the building blocks are so solid that a major innovation has not truly been needed. It makes things "boring," but at the same time it allows things to progress well. Once a need arises that requires a massive breakthrough, the breakthrough will come.
 
We've had plenty of innovation in technology and computers recently. We have flat panel monitors. Ultra-portable laptops. Smartphones. Pretty iPhones. MP3 players. Things are going wireless.

... the list goes on. I guess I'm not in the technology world very much, but how can you not call the above items innovative?
 
Computers (and technology as a whole) have not done something truly innovative in over 10 years at least. However, the building blocks are so solid that a major innovation has not truly been needed. It makes things "boring," but at the same time it allows things to progress well. Once a need arises that requires a massive breakthrough, the breakthrough will come.

The foundation is hardly in place. Even simple programmatic tasks are still done at a largely foundational level. Threading is a horrible nightmare on virtually every platform. Object-oriented programming is still immature in most languages and does not fully meet the ideology behind its design. Most tasks are saddled with serial programming, such that the entire system can be brought down with just one instruction. Threads are seen now as a way to throw instructions to the side to prevent locking up a program, or to increase capacity, which is an inherently wrong philosophy.

Innovation is desperately needed; things may be progressing, but they are doing so too slowly. A programmer may "write" several thousand lines of code per day by reusing code, but there needs to be something else. The model concept comes to mind, where you define the "what" and the middleman (compiler/interpreter) handles the "how".
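To make the "what vs. how" idea concrete, here is a rough sketch in Python (the data and field names are made up purely for illustration). The first version spells out every step; the second just states the result it wants and leaves the mechanics to the runtime.

# Imperative: the programmer spells out "how", step by step.
def top_passing_imperative(records):
    passing = []
    for r in records:
        if r["score"] >= 60:
            passing.append(r)
    passing.sort(key=lambda r: r["score"], reverse=True)
    return passing[:3]

# Closer to "what": state the result you want; the runtime picks the
# loop order, the sort algorithm, maybe one day even the thread count.
def top_passing_declarative(records):
    return sorted(
        (r for r in records if r["score"] >= 60),
        key=lambda r: r["score"],
        reverse=True,
    )[:3]

records = [{"name": "ada", "score": 91},
           {"name": "bob", "score": 45},
           {"name": "cyd", "score": 77}]
print(top_passing_declarative(records))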

New hardware may be here, but it only builds upon existing tech. You are still looking at a 2D surface; no matter how you put it, nothing has really changed in the last 20 years besides decreasing the size and increasing the resolution. Chips are just getting bigger and bigger while not actually becoming smarter; they just stack on instruction after instruction, or try to speed up what has already been done.
 
Twenty years ago the core technology you're putting down was created by no-names and people who just wanted change. I guess from reading your opinions the only thing I could say is "put up or shut up," so to speak. If you don't like the path being followed, make a new one. Prove to them that there are different ways to look at and solve the problems. (No offense intended, I hope you know.)
 
Everything we have now has been evolutionary, not revolutionary.

Is it really evolutionary? A program requires maintenance from a third party for it to adapt, to change. The program is clearly domesticated, trained to act in a new way.

If the program were evolutionary, then it would adapt to new tasks as it encountered them. A spreadsheet program should be able to interpret the data it gets, whether it be XML, CSV, SQL, etc., without having been told how to read it. That is true evolution, true consciousness.

Training a dog to roll clockwise when given a treat is fine, but what happens if you want it to roll counterclockwise? You would have to retrain it to understand. As you increase the complexity, you will constantly be destroying old programming to adapt to new circumstances.
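It's nothing like real "evolution," but a crude Python sketch shows the shape of the spreadsheet example: the loader sniffs the format itself instead of being told, by simply trying each parser in turn (the formats and the row layout here are assumptions for illustration).

import csv, io, json
import xml.etree.ElementTree as ET

def load_table(text):
    # Guess the format instead of being told how to read it.
    try:
        return json.loads(text)                     # JSON?
    except ValueError:
        pass
    try:
        root = ET.fromstring(text)                  # XML?
        return [{cell.tag: cell.text for cell in row} for row in root]
    except ET.ParseError:
        pass
    return list(csv.DictReader(io.StringIO(text)))  # fall back to CSV

print(load_table("name,score\nada,91\nbob,45"))
print(load_table("<rows><row><name>ada</name><score>91</score></row></rows>"))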
 
What would you like to see done?? Do we want the Wii to be combined with a computer?? What about touch-screen computers?? (Which I have seen in Best Buy! Those things are so cool!!)

Forgive my youthful ignorance... but isn't it less about the hardware, so to speak, and more about what you do with a computer? I'm not a technical guy; all I really care about is whether my computer can run at acceptable speeds and whether it can run Guild Wars. But what do you want that is revolutionary in the hardware line?? Lights around the hard drive (which they already have around the RAM)?
 
What would you like to see done?? Do we want the Wii to be combined with a computer?? What about touch-screen computers?? (Which I have seen in Best Buy! Those things are so cool!!)

Forgive my youthful ignorance... but isn't it less about the hardware, so to speak, and more about what you do with a computer? I'm not a technical guy; all I really care about is whether my computer can run at acceptable speeds and whether it can run Guild Wars. But what do you want that is revolutionary in the hardware line?? Lights around the hard drive (which they already have around the RAM)?

Is it really incapable of more? Some of the first mainstream "microprocessors" only had a few thousand transistors. Today's CPUs have over 500 million. What if you redesigned the CPU to be simple, but there were literally thousands of them, executing in a more timely fashion than today's bulky processors? One of the hardest tasks today's CPUs face is copying data, which is why they are horrible for use in routing and why we often use specialized hardware instead.
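You can't redesign a CPU in a forum post, but the software analogue of "lots of simple workers instead of one bulky one" looks roughly like this in Python (the work function is just a stand-in; multiprocessing spreads it across whatever cores the machine has).

from multiprocessing import Pool, cpu_count

def crunch(chunk):
    # Stand-in for whatever per-chunk work each simple "core" would do.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    workers = cpu_count()                      # one worker per available core
    chunks = [data[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        partial_sums = pool.map(crunch, chunks)
    print(sum(partial_sums))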

What if, instead of having to handle the low-level design, programmers could simply define what they are storing and how they present it? Instead of offering primitives, what if the higher level were the de facto standard? Then you wouldn't have to build the primitives, write routines to store and extract the data, and rely on an OS-specific toolkit to programmatically display the data.

Nobody hand-codes the bits for transmitting packets; they use either UDP or TCP to carry their data. It is easier, simpler. By the same ideology you can keep moving things up the stack indefinitely.
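For example, in Python you already live a few layers up the stack: you hand the OS an object turned into text and never see a packet, a checksum, or a bit. A small sketch (socketpair here just stands in for a real network connection):

import json, socket

# Each layer hands the next one something higher-level than raw bits:
# Python object -> JSON text -> bytes -> stream handled by the OS.
left, right = socket.socketpair()        # stand-in for a real connection

message = {"cmd": "save", "rows": [[1, 2], [3, 4]]}
left.sendall(json.dumps(message).encode("utf-8"))
left.close()

received = json.loads(right.recv(4096).decode("utf-8"))  # one recv is fine for a tiny
right.close()                                             # message; real code would frame it
print(received["cmd"], received["rows"])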
 
What would you like to see done?? Do we want the Wii to be combined with a computer?? What about touch-screen computers?? (Which I have seen in Best Buy! Those things are so cool!!)

Why stop there? Too often people are concerned with the connection, the idea of the peripheral. Does it really matter whether data is stored on your local hard disk or on a server across the world? Should it matter whether a touch screen is connected to a machine or whether it is across the house?
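In the same spirit, a small Python sketch of location-transparent reads: one call, and the caller never cares whether the bytes live on the local disk or on a server across the world (the example locations below are hypothetical).

from urllib.request import urlopen

def read_anywhere(location):
    # Same call whether the data lives on this disk or across the world.
    if "://" in location:
        with urlopen(location) as response:      # http://, https://, ftp://, file://
            return response.read()
    with open(location, "rb") as local_file:
        return local_file.read()

# read_anywhere("notes.txt")
# read_anywhere("https://example.com/notes.txt")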
 
...

So...

The whole idea of optics replacing electronics isn't exciting? The sheer speed at which solid-state drives are gaining capacity? Everything growing ever smaller, going mobile? Having your brain control a UI? The emergence of multitouch technology? Robotics?

IBM still makes mainframes. I'm still being taught COBOL as a CS major. Some ancient stuff is still around and somehow kicking. Whether it should be is a matter of debate, but it serves the needs of its users. If you want true innovation in a field, you've got to give its users a seriously good reason to change their tried-and-true ways 'for the better'. And that would probably wind up being harder than convincing a Linux fanboy to switch to Windows Vista.

I say if you think you can do better, or know better, do it. Or get a bunch of other people working on it with you (Open Source Development ftw). Or hey, start your own company. If it's truly innovative and a definite step up from what is currently in use, people can be sold on it.

But really, the last thing I think of when I think of the computing market is 'stagnant'. Unless one considers the subsets of an industry. Like COBOL. COBOL should die in a fire. Oh well.
 
I guess what I am getting at is that current interfaces lack flexibility, programmatically or interface-wise. Imagine if, like on the command line, you could "pipe" data and modify it freely: even though the ls command may have no save function, a modifier can be applied to save, or even to filter the results and then save them (e.g. cat somefile | grep '#' > someotherfile). Instead of writing up an entire program using procedures, you could construct one on the fly using modules.

Need a spreadsheet? Draw a table. Need to convert formulas to output in a cell? Pipe the data through a filter to the display. Need to spell-check something, or highlight syntax? Same idea. By incorporating reusable modules you could literally build up an entire application without any programming knowledge, just like you can build up audio effects by chaining, splitting, joining, etc.
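A toy version of that chaining in Python, just to show the shape of the idea (the stages here are made up; the point is that each one is a small reusable module and the "application" is nothing more than the chain).

def pipe(data, *stages):
    # Feed the data through a chain of small, reusable stages.
    for stage in stages:
        data = stage(data)
    return data

# A few interchangeable "modules".
parse   = lambda text: [line.split(",") for line in text.splitlines()]
numbers = lambda rows: [[float(cell) for cell in row] for row in rows]
totals  = lambda rows: [row + [sum(row)] for row in rows]
render  = lambda rows: "\n".join(" | ".join(f"{c:g}" for c in row) for row in rows)

print(pipe("1,2\n3,4", parse, numbers, totals, render))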

Even if a program needed its own logic, you could incorporate it with scripting languages like Lua, or even straight-up Mono/C# loaded dynamically. Or better yet, the application code could be built into a module where it could be reused to do tasks. Today's world is horribly application-focused, when it should be focused on the data. It shouldn't matter what tool is used; what should matter is what you are trying to accomplish.
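Loading application logic on the fly is already possible in a plain scripting setup; a minimal Python sketch (the file name "my_filter.py" and the run() entry point are hypothetical):

import importlib.util

def load_logic(path, name="user_logic"):
    # Load an ordinary .py file at runtime so its functions become reusable modules.
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# logic = load_logic("my_filter.py")
# result = logic.run(some_data)    # whatever entry point the loaded module exposes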

By using small modules, the chance to optimize the entire codebase at once becomes an even greater reality. Imagine if you had a single piece of code for, say, sorting a table. You could optimize it to use 8 threads because you have 8 cores, and portions of the system could be retooled and optimized literally on the fly to improve the entire system at once. Since every programmer no longer has to focus on doing the little things, they can focus on what matters: the user and their data.
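For what it's worth, the "one sort routine, retuned to the machine" idea can already be faked today. A rough Python sketch that sorts a chunk on every available core and merges the results (the chunking and merge strategy are just one possible choice, not the only way to do it):

import heapq
import random
from concurrent.futures import ProcessPoolExecutor
from os import cpu_count

def parallel_sort(values):
    # One shared routine: sort a chunk per core, then merge the sorted runs.
    workers = cpu_count() or 1
    chunks = [values[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(heapq.merge(*pool.map(sorted, chunks)))

if __name__ == "__main__":
    data = [random.random() for _ in range(100_000)]
    assert parallel_sort(data) == sorted(data)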
 
Optics replacing electronics is evolutionary, not revolutionary. The computer, when first invented, was revolutionary. Fiber optics, when invented, was revolutionary... but fiber replacing silicon is not revolutionary.
 
I know. If I see one more smarmy Mac commercial...

Knock 'em if ya want, but Microsoft's commercials make me want to die inside. About the best they could come up with was the so-called "3D" window switcher in Vista, which is pretty sad. If Apple could promote their OS instead of clobbering the competitor, it might help. Still, their sales are increasing year after year, so they must be doing something right.
 