From SheerUglinessOfUserInterfaces

In my experience the typical computer user doesn't understand very much about computers at all, and anything that gives them a reference point to something they do understand is helpful. Even simple stuff like the representation of an external "hard drive" onto which users can drag a representation of a piece of paper (a document) goes over most people's heads. The average computer user still does not understand that electronic documents are objects that have finite locations, can be found in those locations, and can be moved to other locations. They don't know the difference between documents and the applications used to create and edit them. To get this point across we may actually have to get a whole lot ''more'' "physical", not less; perhaps the desktop should be made to resemble a file cabinet or a bookshelf, and users should be required to use a mouse to actually open "drawers" or remove books; perhaps the mouse pointer needs to look more like a grasping hand that can pick things up and carry them.

Most users don't even know how to resize or move windows or sort lists. They freak out if they accidentally maximize a window and assume they broke something; if they accidentally minimize it, they think the program has stopped running. The less onscreen representations resemble and behave like physical objects in the real world, the more confusing and intimidating they seem to be to most people, so those people simply never use most of those features.

I don't know what it's going to take, actually. I do know that the average computer user is even more confused by "non-physical", computer-centric representations than they were by the DOS command line. I certainly don't think that the average user is going to be able to cope with any scheme that assumes computers can teach them to interact with information in more direct or less "physical" ways. So I think it's better to allow as much user-controlled "skinning" as possible, so individual users can be as comfortable as possible with their machines. If it makes people feel safe and happy to have a video player application look like a Gameboy or even like a little puppet theater, or if it helps people navigate to have an appointment calendar resemble a spiral notebook with "turnable" "pages", then let them have these crutches. Different people conceptualize the world differently, and they need different representations to match.

To propose authoritarian, restrictive standards on how things should or should not be represented is mistaken on its face, and such proposals have a long and unsavory history in the worlds of art and design that began long before computers were invented. Such proposals inevitably end up looking stilted, and their promoters sort of wild-eyed and obsessive, just a few years after they are adopted. -- KenDibble

* An early deployment of the Xerox Alto desktop was to the presidential staff in the White House. The PARC folks were very proud of their metaphor and adhered to it carefully. For example, when you dropped a document on a printer it would show up on the real printer and disappear from the desktop. If you wanted to keep a computer copy after printing, there was a handy copier icon near the printer that could make a copy for you. Well, it seems that the first time an executive document was lost by this mechanism, the staff just called up PARC and told them to get the documents back and not lose any more.
There is an '''''extensive''''' literature on UI design in HCI. There's a whole ''subfield'' dealing with how stupid it is to slavishly imitate physical objects: creating software calculators with "buttons" for each of the digits, or a "desktop" with "cabinets" and "drawers" from which you can take "books". Hell, there were theories of learning made precisely to explain how people do and don't learn UIs.

You can't do good design work by following rules, and that's what these people are trying to do. They see "people deal well with physical objects" and they try to transfer all the rules of physical objects over to software objects, hoping they'll get the conclusion "people deal well with software objects". But it doesn't work like that, and it's completely stupid. It's stupid because software is far more powerful than mere physical objects can ever be. It's stupid because people deal with material objects through a complex and rich three-dimensional, tactile, textured interface, whereas they deal with computers using only a very narrow set of keys on a keyboard and a single 2D pointer.

But anyway, there are entire books written on this subject and I'm not going to transcribe them here. I'm pretty sure AlanCooper writes about imitation in AboutFace, since I recall reading about it recently and that's the only book that would fit the criteria; I seem to recall an anecdote about someone creating a cellphone UI in a slavishly imitative style. Without going into those books, I'll only say that it is wrong to represent physicality if you're not going to go through with it, and it's uber-wrong to be SlavishlyImitatingPhysicality. A real designer who wants to exploit "people deal well with physical objects" wouldn't slavishly imitate them. They'd figure out the principles guiding '''why''' people deal well with physical objects and then apply those to a UI while freeing it from all other arbitrary constraints. -- RK

Apparently the people who write these books haven't spent much time watching typical computer users try to figure out how to deal with all the "non-physical" conventions that exist in popular GUIs. Computers are as common as toasters now. Anybody with two or three just-above-minimum-wage part-time jobs can afford to own one--and does. There's also research demonstrating that sizeable numbers of these people don't even have much capacity for abstract thought, let alone the ability to deal with non-physical abstractions for information. I'm aware of some of the research on interfaces, and I doubt very much that the experimental subjects were people like this.

I'm not married to the idea of SlavishlyImitatingPhysicality. I'm pointing out that today's GUIs are largely incomprehensible to non-geeks, regardless of what highly reputable design theorists and expensively paid designers predicted and expected, and I'm adhering to the thesis that making them more abstract will not help the problem. I'm also reiterating the thesis that broad, sweeping, authoritarian design "manifestos" are doomed to rapid extinction and subsequent ridicule. Representation of information, to be maximally effective, must take into account the fact that different people perceive the world differently; therefore, it must be customizable to meet individual needs and preferences. If some people prefer graphical representations of... well... graphs, let them have them. If others can function better with bookshelves and movie theaters, let them have those.
-- KenDibble

Nobody said that WIMP (the standard GUI) is any good. In point of fact, WimpIsBroken, MenusAreEvil, WindowsAreEvil, PointersAreEvil, ButtonsAreEvil, and IconsAreEvil. WIMP wasn't created by interaction designers, or designers, or HCI researchers. You're preaching to the choir. For instance, when you said that "users can't distinguish between applications and files" I thought "of course, and why would they when both are presented as objects?". But don't let me stop you, since I've been dealing with too many programmers who think software is just fiiiiiiineeee. I'm actually enjoying this. -- RK

----

Imitating earlier technology is a common pattern. Early cars imitated horse-drawn carriages, etc. Software is no different. After people grow accustomed to software they will let go of imitations of physical devices when it suits them.

''And that's supposed to happen when, in 20-50 years?''

It happens whenever it suits them. Remember, these are the same people whose symbol for "prescription" looks like the Eye of Horus. Don't sweat it. [No kidding. I guess you learn something new every day: http://www.straightdope.com/mailbag/mrx.html]

''Don't sweat it? These programmers that are imitating horse-drawn carriage UIs are doing so for software that '''I''' use. Why should I forgive them for inflicting their intellectual limitations on me?''

They don't need your forgiveness because you made the decision to use their software. Quit whining and write your own software if you don't like theirs.

''I "decided" to use their software just like the Americans "decided" to have Herr Shrub in power. And how does the song go? Oh, yes, "Quit whining and leave the country if you don't like it."''

Wait, are you trying to say that someone else picked the software you use? Are you in an institution or something? What's stopping you from using other software or writing your own? You obviously have a clear vision of what you want. Why spend so much time bitching when you could be making yourself happy? Or is it that bitching is what really brings you joy?

''What's stopping millions of left-wing Americans (socialists, communists and anarchists) from moving out of the USA?''

That's a specious argument. The software you use is a simple, personal choice. You won't be separated from your culture, lifestyle, family, friends, etc., by using different software. Answer my question: why do you spend so much time complaining when you could be spending that time making yourself happy?

''Some people's mental abilities would be improved if their brains got aerated.''

Is that a threat?

''A helpful suggestion.''

^ What the hell? Annoying straw-man dichotomy that goes off into a weird tangent.

----

What model is ''not'' arbitrary? SoftwareGivesUsGodLikePowers to create any world we want, as long as it is well-defined and provides the right answers (output) within a reasonable amount of time. Modeling file cabinets and paper may be an AbstractionInversion, but what isn't, and why does it matter?

''HumanComputerInteraction is science, not art. The workings of the human brain are not arbitrary. And it matters because some UIs are powerful while others aren't, some are honest and forthright while others aren't, some are easy to learn and use while others aren't, and so on and so forth.''

If it is science, then where are the vaults of experiments? A bunch of psychologists sitting around and speculating is not "science". Science requires tests against reality in order to complete the circle.
Further, I strongly suspect that the ideal interface for person A would not be the ideal interface for person B. Thus, the researchers have a tough job, since there probably is no One Ideal Interface. Making a custom (or adjustable) interface per person probably requires the use of SeparateMeaningFromPresentation.

''Yawn, there's plenty of experimental research in HCI. And I really enjoy people who cast aspersions on a subject they've never even heard of before.''
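As a rough illustration of what SeparateMeaningFromPresentation might buy here, consider this minimal sketch (in TypeScript, with invented names like Appointment and CalendarSkin; it is one assumed way to structure things, not anything proposed by the participants above). The "meaning" is a plain data model, and each "skin" is just a presentation function over it, so the spiral-notebook crowd and the plain-list crowd can both be served from the same data:

 // Hypothetical sketch: the "meaning" is a plain data model.
 interface Appointment { title: string; start: Date; }

 // A "skin" is any function that turns the model into a presentation.
 type CalendarSkin = (appointments: Appointment[]) => string;

 // Spiral-notebook style skin: one "page" per appointment.
 const notebookSkin: CalendarSkin = (items) =>
   items.map((a, i) => `Page ${i + 1}: ${a.title} at ${a.start.toISOString()}`).join("\n");

 // Bare list skin for people who prefer an unadorned view of the same data.
 const listSkin: CalendarSkin = (items) =>
   items.map((a) => `${a.start.toISOString()}  ${a.title}`).join("\n");

 // The meaning never changes; only the per-user choice of presentation does.
 const appointments: Appointment[] = [
   { title: "Dentist", start: new Date("2024-03-01T09:00:00Z") },
 ];
 console.log(notebookSkin(appointments));
 console.log(listSkin(appointments));

The only point of the sketch is that adding yet another skin never touches the data model, which seems to be roughly what an adjustable per-person interface would require.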