I installed the device driver, toolkit, and SDK (after taking help from here and here for problems with the make/gluErrorString and missing CUDA libraries error messages), and both deviceQuery and bandwidthTest passed. The next step is to make sure that I can use Eclipse for CUDA code development.
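Before moving on to Eclipse, a quick way to exercise the whole toolchain is to compile and run a tiny kernel of one's own rather than relying only on the SDK samples. Here is a minimal sketch (the file name, kernel name, and launch configuration are illustrative assumptions of mine, not anything from the SDK):

// sanity_check.cu -- hypothetical file name; fills an array on the GPU
// and verifies one element back on the host.
// Build and run (assuming nvcc is on the PATH):
//   nvcc sanity_check.cu -o sanity_check && ./sanity_check
#include <cstdio>
#include <cuda_runtime.h>

__global__ void fill(int *a, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)
        a[i] = 2 * i;  // each thread writes one element
}

int main(void)
{
    const int n = 256;
    int host[256];
    int *dev = NULL;

    cudaMalloc((void **)&dev, n * sizeof(int));    // allocate device memory
    fill<<<(n + 63) / 64, 64>>>(dev, n);           // 4 blocks of 64 threads
    cudaMemcpy(host, dev, n * sizeof(int), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("host[10] = %d (expected 20)\n", host[10]);
    return 0;
}

If that prints the expected value, the compiler, driver, and runtime are all talking to one another, and any remaining trouble is likely to be Eclipse configuration rather than the CUDA installation itself.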
All the other PC makers back then basically saw their computers as industrial tools. What they cared about – and what most buyers had been told to care about – was the specs of the innards, things like chip speed and hard drive capacity. Jobs sensed that there was in fact a set of computer buyers who might actually want a computer that was the color of the ocean off the coast of Australia – and not only that, but that they might well enjoy forking out a little extra money for the privilege of owning such a computer. A computer, Jobs saw, wasn’t just a tool. It was a fashion accessory. And as the guts of PCs continued to turn into commodities, his instinct was confirmed: it was the outside of the PC – the shape of it, the color of it, the look and feel of it – that came to matter. His insight resurrected Apple and killed the beige box.
A nice piece.
But I disagree that the scholarly monograph is dead. Personally, I expect monographs to undergo a renaissance as more academics adopt e-publishing. Academic presses affiliated with universities should be going all-digital, and should start massively promoting their back catalogs as e-books at fire-sale prices. The smart ones will take the opportunity to change their agenda, competing to publish new books by a new generation of scholars who are building a broad readership both inside and outside academia. There’s no reason why we need to constrain our scholarship to books so boring that nobody wants to read them. Tomorrow’s scholars should be engaging with a much broader public than university presses have historically cultivated.
I agree; of course, there is a stumbling block, and remedies for the same are discussed in the piece. Take a look!
What do we really know about creativity? Very little. We know that creative genius is not the same thing as intelligence. In fact, beyond a certain minimum IQ threshold – about one standard deviation above average, or an IQ of 115 – there is no correlation at all between intelligence and creativity. We know that creativity is empirically correlated with mood-swing disorders. A couple of decades ago, Harvard researchers found that people showing ‘exceptional creativity’ – which they put at fewer than 1 per cent of the population – were more likely to suffer from manic-depression or to be near relatives of manic-depressives. As for the psychological mechanisms behind creative genius, those remain pretty much a mystery. About the only point generally agreed on is that, as Pinker put it, ‘Geniuses are wonks.’ They work hard; they immerse themselves in their genre.

Could this immersion have something to do with stocking the memory? As an instructive case of creative genius, consider the French mathematician Henri Poincaré, who died in 1912. Poincaré’s genius was distinctive in that it embraced nearly the whole of mathematics, from pure (number theory) to applied (celestial mechanics). Along with his German coeval David Hilbert, Poincaré was the last of the universalists. His powers of intuition enabled him to see deep connections between seemingly remote branches of mathematics. He virtually created the modern field of topology, framing the ‘Poincaré conjecture’ for future generations to grapple with, and he beat Einstein to the mathematics of special relativity. Unlike many geniuses, Poincaré was a man of great practical prowess; as a young engineer he conducted on-the-spot diagnoses of mining disasters. He was also a lovely prose stylist who wrote bestselling works on the philosophy of science; he is the only mathematician ever inducted into the literary section of the Institut de France.

What makes Poincaré such a compelling case is that his breakthroughs tended to come in moments of sudden illumination. One of the most remarkable of these was described in his essay ‘Mathematical Creation’. Poincaré had been struggling for some weeks with a deep issue in pure mathematics when he was obliged, in his capacity as mine inspector, to make a geological excursion. ‘The changes of travel made me forget my mathematical work,’ he recounted.
Having reached Coutances, we entered an omnibus to go some place or other. At the moment I put my foot on the step the idea came to me, without anything in my former thoughts seeming to have paved the way for it, that the transformations I had used to define the Fuchsian functions were identical with those of non-Euclidean geometry. I did not verify the idea; I should not have had time, as, upon taking my seat in the omnibus, I went on with a conversation already commenced, but I felt a perfect certainty. On my return to Caen, for conscience’s sake, I verified the result at my leisure.
How to account for the full-blown epiphany that struck Poincaré in the instant that his foot touched the step of the bus? His own conjecture was that it had arisen from unconscious activity in his memory. ‘The role of this unconscious work in mathematical invention appears to me incontestable,’ he wrote. ‘These sudden inspirations … never happen except after some days of voluntary effort which has appeared absolutely fruitless.’ The seemingly fruitless effort fills the memory banks with mathematical ideas – ideas that then become ‘mobilised atoms’ in the unconscious, arranging and rearranging themselves in endless combinations, until finally the ‘most beautiful’ of them makes it through a ‘delicate sieve’ into full consciousness, where it will then be refined and proved.
Poincaré was a modest man, not least about his memory, which he called ‘not bad’ in the essay. In fact, it was prodigious. ‘In retention and recall he exceeded even the fabulous Euler,’ one biographer declared. (Euler, the most prolific mathematician of all – the constant e takes his initial – was reputedly able to recite the Aeneid from memory.) Poincaré read with incredible speed, and his spatial memory was such that he could remember the exact page and line of a book where any particular statement had been made. His auditory memory was just as well developed, perhaps owing to his poor eyesight. In school, he was able to sit back and absorb lectures without taking notes despite being unable to see the blackboard.
It is the connection between memory and creativity, perhaps, which should make us most wary of the web. ‘As our use of the web makes it harder for us to lock information into our biological memory, we’re forced to rely more and more on the net’s capacious and easily searchable artificial memory,’ Carr observes. But conscious manipulation of externally stored information is not enough to yield the deepest of creative breakthroughs: this is what the example of Poincaré suggests. Human memory, unlike machine memory, is dynamic. Through some process we only crudely understand – Poincaré himself saw it as the collision and locking together of ideas into stable combinations – novel patterns are unconsciously detected, novel analogies discovered. And this is the process that Google, by seducing us into using it as a memory prosthesis, threatens to subvert.
Situational overload and ambient overload: those are the two types of information overload, says Nicholas Carr. Further, he argues that better filters make ambient overload worse. A nice one!
Situational overload is the needle-in-the-haystack problem: You need a particular piece of information – in order to answer a question of one sort or another – and that piece of information is buried in a bunch of other pieces of information. The challenge is to pinpoint the required information, to extract the needle from the haystack, and to do it as quickly as possible. Filters have always been pretty effective at solving the problem of situational overload. The introduction of indexes and concordances – made possible by the earlier invention of alphabetization – helped solve the problem with books. Card catalogues and the Dewey decimal system helped solve the problem with libraries. Train and boat schedules helped solve the problem with transport. The Reader’s Guide to Periodicals helped solve the problem with magazines. And search engines and other computerized navigational and organizational tools have helped solve the problem with online databases.
Whenever a new information medium comes along, we tend to quickly develop good filtering tools that enable us to sort and search the contents of the medium. That’s as true today as it’s ever been. In general, I think you could make a good case that, even though the amount of information available to us has exploded in recent years, the problem of situational overload has continued to abate. Yes, there are still frustrating moments when our filters give us the hay instead of the needle, but for most questions most of the time, search engines and other digital filters, or software-based, human-powered filters like email or Twitter, are able to serve up good answers in an eyeblink or two.
Situational overload is not the problem. When we complain about information overload, what we’re usually complaining about is ambient overload. This is an altogether different beast. Ambient overload doesn’t involve needles in haystacks. It involves haystack-sized piles of needles. We experience ambient overload when we’re surrounded by so much information that is of immediate interest to us that we feel overwhelmed by the never-ending pressure of trying to keep up with it all. We keep clicking links, keep hitting the refresh key, keep opening new tabs, keep checking email in-boxes and RSS feeds, keep scanning Amazon and Netflix recommendations – and yet the pile of interesting information never shrinks.
Have a look!
Two very interesting pieces.
The shift in our view of memory is yet another manifestation of our acceptance of the metaphor that portrays the brain as a computer. If biological memory functions like a hard drive, storing bits of data in fixed locations and serving them up as inputs to the brain’s calculations, then offloading that storage capacity to the Web is not just possible but, as Thompson and Brooks argue, liberating. It provides us with a much more capacious memory while clearing out space in our brains for more valuable and even “more human” computations. The analogy has a simplicity that makes it compelling, and it certainly seems more “scientific” than the suggestion that our memory is like a book of pressed flowers or the honey in a beehive’s comb. But there’s a problem with our new, post-Internet conception of human memory. It’s wrong.
It is a truth universally acknowledged that education is the key to economic success. Everyone knows that the jobs of the future will require ever higher levels of skill. That’s why, in an appearance Friday with former Florida Gov. Jeb Bush, President Obama declared that “If we want more good news on the jobs front then we’ve got to make more investments in education.”
But what everyone knows is wrong.
Take a look!
There is nothing like formatting a hard disk and doing a clean installation of Linux. And Ubuntu is such a pleasure to load. Of course, I have come a long way from the all-night sessions needed to get X up and running; at the same time, the Linux distributions have also come a long way, with nice GUIs and easy installation procedures. It just takes a few hours to have the machine loaded with all your favourites — gsl, gcc, latex, gnuplot, octave, scilab, and so on. After that, of course, you spend the rest of the time writing blog posts like this one.
NB: I loaded Ubuntu 10.04.1 on my Sony VAIO VPCEA36FG laptop; I found this site very helpful to get the touchpad and sound working.
Along comes Google, the pirate. It uses a new kind of ship, swift and agile, that can dart among the lumbering Microsoft galleons. The galleons have three and four rows of heavy cannon, but they are too ungainly to turn, and the big guns, so effective when attacking a fixed fortress, can hit nothing. (Apple is far over the horizon during all this, in a high-tech racing sailboat. It’s just one boat, but it flies.)
There are some really interesting pieces on information in there:
Microsoft could have taken .NET and created a thin layer of glue onto the hardware, and come up with a really good, robust, and revolutionary OS. When I mentioned this in a previous blog, a commenter who apparently worked within Microsoft Research said that a project like that had been going on and that some pieces of it might eventually appear in Windows. Of course, it could never be put out there as a real OS, because it would compete with Windows, and someone has too much turf power for that to happen.
Contrast this with Chrome OS. It’s been pointed out that Android is also being used as an OS, so gasp! There are two potentially competing OSes from a single company! Google shrugs and says “yeah, there might be some overlap, but they were designed with different goals in mind so we’ll just see what happens.” Something that would cause major political battles within Microsoft produces indifference within Google.
And the piece is full of analogies; I will quote the one with which Eckel ends:
I don’t see how Microsoft can change. What you’ve got is one of those nets in the jungle (think “Lost” here) which springs up and traps people into a hanging ball of bodies. Take one of those nets and fill it with Microsoft VPs. The net is constantly pressing them together as they struggle. No one can see that the net itself is an arbitrary constraint, because it presses everyone into a zero-sum game. Google comes wandering through the jungle, whistling, notices for a moment the ball of VPs fighting among themselves, and wanders on.
Unfortunately, the only way to fix the problem is for someone to come along and cut through the net, while everyone inside is screaming “Don’t cut it! We’ll fall!” And of course there would be a fair number of bruises, sprains and some broken bones. (Important note to Microsoft: I now do management consulting, although prepare yourself for truly outrageous fees, payable in advance. I can definitely come in and fix your company).
Take a look!