Adonis Diaries

Archive for the ‘professional articles’ Category

Only 12 words to define Entrepreneurship?

BILL MURPHY JR., updated on April 24, 2014

There’s a definition of entrepreneurship that has changed how I think about the way people choose their paths in life. It helped me to build a thriving business and find all kinds of great new experiences. Heck, it even helped me to meet my wife.

I believe it can have the same kind of positive impact for you, if you’re willing to try to put it into practice:

Entrepreneurship is the pursuit of opportunity without regard to resources currently controlled.

That’s the 12-word definition of entrepreneurship that they teach at Harvard Business School.

I first read it while researching my 2010 book, The Intelligent Entrepreneur.

I remember staring at it on the page and feeling like a boy noticing girls for the first time: There’s something really interesting here, but I know there’s a lot more to it than I currently understand.

I’d like to break the definition down for you, because it not only gives insight into why people like you are so drawn to the idea of starting and building something, but also improves the likelihood that you’ll be successful.

(As a quick aside, seeing that definition in another of my books is what originally led me to meet Inc.’s editor-in-chief, Eric Schurenberg. A column he wrote about it became the most-read article in the history of Inc.com at that time.)

1. “Entrepreneurship…”

Let’s start with the word itself: Entrepreneurship. A noun with few true synonyms. (That lack of real synonyms can be a real pain in the neck.) It’s not simply a matter of being a boss or a leader or owning a business. In fact, there’s nothing intrinsic at all in this definition about business, or risk, or even making money. It’s something different–a way of looking at the world.

2. “…is the pursuit of opportunity…”

There are two key words here: pursuit and opportunity.

“Pursuit” means there has to be action involved (hence, my reader-inspired decision this year to change the name of my column to Action Required). You have to have impact; you have to try to change something. Simply thinking about an idea doesn’t cut it, and neither does coasting along doing what you’ve always done.

Similarly, a true entrepreneur is always pursuing “opportunity.” That means something new, bigger, nicer, better, smarter, more useful.

It often also means pursuing the most amazing, appealing, enticing opportunities you can find.

Here’s where we really start to differentiate true entrepreneurs from everyone else.

There are a lot of good people out there running very nice businesses. However, if they’re not chasing new opportunities–if they’re coasting along, doing what they’ve always done–then maybe they’ve given up the mantle of true entrepreneurship.

3. “…without regard to resources currently controlled.”

This might just be my favorite phrase in the world. I suppose if Harvard Business School had wanted to make the definition more accessible, they could have said “regardless of” instead of “without regard to,” but no matter.

“Without regard to resources currently controlled” means it doesn’t matter how little you have at the start. It doesn’t matter that you don’t have money, or that you don’t have all the required skills, or that you don’t have a team to help you.

At the very beginning especially, reach for the stars. Don’t let the opportunities you pursue be limited by the assets you currently have. Instead, let the attractiveness of the opportunity serve as your guide.

There are so many implications of this part of the definition.

For one thing, while capital is a necessary ingredient, the truth is that all of those would-be entrepreneurs out there who blame a lack of money for their inability to get started are playing the wrong game.

There’s an advantage to not having money at the start, because that scarcity forces you to be more resourceful. It means you have to sell your ideas to others–a possibly painful exercise, but one that pays huge dividends in the long run.

Here’s the bottom line: For just about any decision you have to make in life, there are two ways to make choices.

Most people choose the first method of decision making. They look at the array of options that seem reasonably attainable, and then pick the best one. They choose a career because it’s what their parents advised, or because there are jobs available. They live somewhere because it’s what they’re familiar with. They surround themselves with the kinds of people they’ve always known.

The true entrepreneur, however, sees things differently.

Instead of choosing the best available option, he or she thinks big, and tries to identify the best possible solution, regardless of whether it seems completely implausible and unattainable. Then, he or she gets to work, trying to make that impossible dream a reality.

If you choose the first path, you might save yourself a lot of heartache, and a lot of ups and downs on the roller coaster of life. However, you also run a greater risk of achieving your goals only to find you didn’t push yourself enough. Which path will you choose?

BILL MURPHY JR. is a journalist, ghostwriter, and entrepreneur. He is the author of Breakthrough Entrepreneurship (with Jon Burgstone) and is a former reporter for The Washington Post. @BillMurphyJr


History Of Graphic Design: In Icons

We know two things for sure about the guys over at Brooklyn’s Pop Chart Lab: they love drinking, and they love good graphic design.

Pascal Zoghbi posted this link on May 3, 2014 via FB:
The History Of Graphic Design, In Icons http://t.co/f37hgSLUEx

Their latest poster is a tribute to the entire history of the latter: The gridded, black-and-white poster is a cheat sheet to the history of graphic design, beginning with the Victorian era.

Start at the top, left-hand corner, of A Stylistic Survey of Graphic Design, and read from left to right.

Each era (say, Arts & Crafts or Art Nouveau) is represented by a rectangular box that includes several squares that graphically represent the style described.

The Modern movement, one of the largest movements depicted here, includes Bauhaus, Vorticism, De Stijl, New Typography and Isotype, Constructivism, Suprematism, and Futurism.

Pop Chart creates, within each stamp-sized box, a visual representation of that particular style, with the design elements that prevailed at the time.

So the Constructivism box echoes the intense Soviet Party posters from the 1920s, the Futurism box has a bold, attention-grabbing arrow on it, and so on.

It’s telling that certain eras–eras that were niche or short-lived, or which are still emerging–get just one box. (This includes Dada, Digital, and Street Art/Guerrilla.)

Scan down to the bottom for a sampling of today’s reigning design philosophies. Are they right?

There’s data visualization, there’s the twee, chalkboard-loving school of handcrafted, and there’s flat design.

But where’s skeuomorphism?

Each box is efficiently packed, providing an at-a-glance answer to any designer who might ask: What, again, were the defining elements of the Late Modern Polish School era? For the rest of us, it’s just nice to look at.

Pre-order A Stylistic Survey of Graphic Design for an early bird price of $23, here.

[Image: Courtesy of Pop Chart Labs]

MARGARET RHODES

Margaret Rhodes is an associate editor for Fast Company magazine, where she produces Wanted …

 

Is better possible?

The answer to this is so obvious to me that it took me a while to realize that many people are far more comfortable with ‘no’.

The easiest and safest thing to do is accept what you’ve been ‘given’, to assume that you are unchangeable, and the cards you’ve been dealt are all that are available.

When you assume this, all the responsibility for outcomes disappears, and you can relax.

When I meet people who proudly tell me that they don’t read (their term) “self-help” books because they are fully set, I’m surprised.

First, because all help is self help (except, perhaps, for open heart surgery and the person at the makeup counter at Bloomingdales). But even this sort of help requires that you show up for it.

Mostly, I’m surprised because there’s just so much evidence to the contrary.

Fear, once again fear, is the driving force here.

If you accept the results you’ve gotten before, if you hold on to them tightly, then you never have to face the fear of the void, of losing what you’ve got, of trading in your success for your failure.

And if you want to do this to yourself, this is your choice.

But don’t do it to others. Don’t do it to your kids, or your students, or your co-workers.

Don’t do it to the people in under-privileged neighborhoods or entire countries.

Better might be difficult, better might involve overcoming unfair barriers, but better is definitely possible. And the belief that it’s possible is a gift.

We owe everyone around us not just the strongest foundation we can afford to offer, but also the optimism that they can reach a little higher.

To write off people because you don’t think getting better is comfortable enough is sad indeed.

Better is a dream worth dreaming.

Wishing vs. doing. Line or staff?

By giving people more ways to speak up and more tools to take action, we keep decreasing the gap between what we wish for and what we can do about it.

If you’re not willing to do anything about it, best not to waste the energy wishing about it.

Line or staff?

The most urgent jobs tend to be line jobs. Profit and loss. Schedules to be drawn and honored. Projects to deliver.

The line manager initiates. The line manager delivers.

Staff jobs are important, no doubt about it.

The staff keeps the lights on, provides resources on demand and is standing by ready to help the line manager. But the staff person doesn’t get to say yes and doesn’t get to say go.

In fact, the best staff people get that way by acting like they’re on the line.

When you can, take responsibility. Say go.

Observations in the game of Petanque (boules)

I have been playing and observing people play Petanque (Boules).

Many players are dissatisfied with my recommendations and comments, and outright condemn my interference, since they consider themselves Pros without any performance statistics to back that up.

1. The pointer ends up a better shooter than the one who “lets off steam” with hard shots.
2. Those who designate themselves shooters and never point tend to lose the match for their team.
3. The shooter uses up his boules early on, then reveals himself as an adviser on “how to point” and “where to point.”
4. Hard (macho) shots may impress beginners, but they don’t win the game.

5. “Soft” shots have the advantage of making a “carreau” (knocking the target boule away and taking its place) more often, and of keeping your boule on the terrain.

6. The club that refuses to keep performance statistics invariably fails in competitions.

7. Almost all beginners consider themselves self-appointed professionals on how the game should be played and on proper technique.

You have a choice of a wide range of suggestions on how to play. The terrain dictates your technique for the boule’s trajectory (a very high or a low arc). My set of suggestions for better pointing:

  1. Look at where you want the boule to land, according to your trajectory method.
  2. Pay attention to the direction of your wrist; otherwise the boule follows the natural direction of your wrist.
  3. Use your wrist and your elbow, never the shoulder.
  4. The rules give you 30 seconds to point: take your time, and let the other team fidget.

Tips for shooting:

  1. Same as for pointing, but look at the spot on the boule you want to hit, as in billiards.
  2. Draw the arm back in a quarter circle behind the back to give the boule a slight spin.
  3. Keep the knees slightly bent for more balance and flexibility, and to avoid straining the back muscles or the shoulder.
  4. Stay “cool”: as we say in Lebanese, it doesn’t need all that fuss.

Note: In a previous article, I suggested criteria for Petanque Performance statistics and a few rules and regulations https://adonis49.wordpress.com/2017/08/24/measuring-petanque-performance-which-club-took-this-important-step/
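For a club willing to take that step, a minimal sketch of such record-keeping might look like the following. The player name, action labels and scoring fields are purely illustrative, not taken from my earlier criteria:

    # A hypothetical sketch of the performance statistics a club could
    # keep: per-player success rates for pointing and shooting.
    # All names and fields here are illustrative, not official criteria.
    from collections import defaultdict

    class PetanqueStats:
        def __init__(self):
            # player -> action -> [successes, attempts]
            self.tallies = defaultdict(lambda: {"point": [0, 0], "shoot": [0, 0]})

        def record(self, player, action, success):
            made, tried = self.tallies[player][action]
            self.tallies[player][action] = [made + int(success), tried + 1]

        def rate(self, player, action):
            made, tried = self.tallies[player][action]
            return made / tried if tried else 0.0

    stats = PetanqueStats()
    stats.record("Rami", "shoot", True)
    stats.record("Rami", "shoot", False)
    stats.record("Rami", "point", True)
    print(stats.rate("Rami", "shoot"))   # 0.5: one hit out of two tries

Even a tally this simple settles arguments about who should point and who should shoot.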

Start your project with a clean sheet of paper: It forces boundaries

A clean sheet of paper still has edges.

It’s tempting to believe that creativity comes from starting fresh.

But even when we start fresh, we approach projects and problems with self-created boundaries.

You can’t do real work without edges, without something to leverage, but those edges don’t have to be the same edges as everyone else uses.

Creative people often excel because they change the shape of the clean sheet.

Does the human eye prove that God exists?

Darwin was baffled by it; Christians see it as evidence of the divine. Will science ever unlock the secrets of the human eye?

When the body of Dr Yoshiki Sasai, an eminent Japanese biologist, was discovered in August this year, his death was widely mourned across the world of science.

Not just for the abrupt end to his glittering career – one which had seen him win several awards, including the 2010 Osaka Science Prize, and become the laureate of the 2012 Inoue Prize for Science.

Nor because of the tragic manner of his death: the 52-year-old was found hanged in his own laboratory – an apparent suicide after a scandal over a research paper he’d co-authored in January.

[Image: Flawless, a close-up of the human eye. Photo: Dimitri Vervitsiotis/Getty]

Chris Bell posted this on 24 Sep 2014

Instead, the scientific world lamented what, perhaps, Dr Sasai was about to achieve.

As one of the directors at the RIKEN Center for Developmental Biology in Kobe, he was one of the world’s leading experts in stem cell technology. His team had pioneered incredible new techniques for creating organ-like structures – making giant strides towards a future where replacements for our failing human organs could be grown in a Petri dish.

And most tragically, the months before his death had heralded Sasai’s biggest achievement.

His team had already grown partial pituitary glands and even bits of the brain, but now he’d coaxed embryonic stem cells into forming the functioning tissue of arguably the most complex and scrutinised organ in the entire animal kingdom. Sasai had grown an eye.

And in doing so, he’d also helped resolve a scientific obsession that had lasted centuries.

In very basic form, the eye is thought to have first developed in animals around 550 million years ago.

But such is its perfect design – its infinite adaptability, and irreducible complexity – that many argue it is proof of the divine itself.

Darwin remarked that to suppose something so flawless “could have been formed by natural selection, seems, I freely confess, absurd in the highest degree.”

The eye has become a focal point for biologists, ophthalmologists, physicists and many other branches of science ever since. So when the Spanish neuroscientist Santiago Ramón y Cajal made the first anatomical diagrams of neurons and the retina in 1900, it stoked a century of biologists attempting to unlock the eye’s secrets.

And there have been several discoveries. Unlike our ears and nose, for example, which never stop growing our entire lives, our eyes remain the same size from birth.

Then there’s the complicated process of irrigation, lubrication, cleaning and protection that happens every time we blink – an average of 4,200,000 times a year.

[Image: Dr Yoshiki Sasai, the late Japanese biologist who was building a human eye in his lab (Dimitri Vervitsiotis/Getty)]

And there are other astonishing inbuilt systems too.

Take, for example, a little trick called the Vestibulo-ocular reflex (VOR). In short, it’s our own personal Steadicam – an inbuilt muscular response that stabilises everything we see, by making tiny imperceptible eye movements in the opposite direction to where our head is moving.

Without VOR, any attempts at walking, running – even the minuscule head tremors you make while you read these words – would make our vision blurred, scattered and impossible to comprehend.
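As a rough illustration of that stabilising principle (a toy model, not physiology; the gain and time step below are illustrative assumptions), an ideal VOR simply drives the eye at the opposite of the head’s velocity:

    # Minimal sketch of the vestibulo-ocular reflex (VOR): the eye
    # counter-rotates against the head so that gaze direction
    # (head angle + eye-in-head angle) stays fixed on the target.
    # The gain and time step are illustrative assumptions.

    DT = 0.001        # simulation step, in seconds
    VOR_GAIN = 1.0    # ideal reflex: eye velocity = -1.0 x head velocity

    def simulate_vor(head_velocities_deg_per_s):
        head_angle, eye_angle = 0.0, 0.0
        gaze_trace = []
        for head_vel in head_velocities_deg_per_s:
            eye_vel = -VOR_GAIN * head_vel            # the counter-rotation
            head_angle += head_vel * DT
            eye_angle += eye_vel * DT
            gaze_trace.append(head_angle + eye_angle)  # gaze in space
        return gaze_trace

    # A head turn right, then left: with gain 1.0, gaze never moves.
    gaze = simulate_vor([100.0] * 200 + [-100.0] * 200)
    print(max(abs(g) for g in gaze))   # 0.0 for an ideal reflex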

But while the inner workings of the eye continue to surprise scientists, the last decade has seen an unprecedented confluence of biology, technology and ophthalmic innovation. An international scientific endeavour that is not only finally unlocking the eye’s true potential – but also how to counter, and ultimately cure, its biggest weaknesses.

One scientist leading the charge is Professor Chris Hammond, the Frost Chair of Ophthalmology at King’s College London. “I’ve been working in ophthalmology for nearly 25 years,” he says. “And I think we’re at a key moment. The pace of our genetic understanding, cell-based therapies and artificial devices for the treatment of eye disease is advancing faster than ever.”

His personal crusade – treating common conditions such as myopia, cataracts and glaucoma, as well as eye diseases – is, he says, slowly becoming possible.

“For example, we’re finally starting to understand some of the mechanism of these diseases – how genetic and environmental risk factors, and not ageing, might be significant. And with some of the rarer diseases, we’re starting to look at actual cures.

“We are also understanding more and more about the processing that is already being done within the retina, before signals are sent to the brain. And with the amazing abilities we have today for imaging, the emerging technologies are exciting too.”

With much fanfare, the first bionic eye debuted last year.

Developed by Second Sight Medical Products, the Argus II Retinal Prosthesis System consists of 60 electrodes implanted in the retina, and glasses fitted with a special mini-camera. Costing €73,000 (£58,000) to install, it then sends images – albeit very low-resolution shapes – to the user’s brain. Which means people with degenerative diseases such as retinitis pigmentosa can differentiate between light and dark, or make out basic shapes such as doorways.
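To get a feel for how coarse 60 electrodes are, here is a minimal sketch that reduces a camera frame to 60 stimulation values. The 6-by-10 grid is an illustrative assumption, not the device’s documented layout:

    # A sketch of why Argus II imagery is so coarse: a whole camera frame
    # must be reduced to stimulation levels for just 60 electrodes.
    # The 6-by-10 grid is an illustrative assumption, not the device spec.

    def downsample_to_electrodes(frame, rows=6, cols=10):
        """frame: 2D list of grayscale pixels; returns rows x cols block averages."""
        h, w = len(frame), len(frame[0])
        bh, bw = h // rows, w // cols
        grid = []
        for r in range(rows):
            row = []
            for c in range(cols):
                block = [frame[y][x]
                         for y in range(r * bh, (r + 1) * bh)
                         for x in range(c * bw, (c + 1) * bw)]
                row.append(sum(block) / len(block))
            grid.append(row)
        return grid

    # A 480x640 frame collapses to just 60 values: enough to tell a lit
    # doorway from a dark wall, but nothing like natural vision.
    frame = [[128] * 640 for _ in range(480)]
    grid = downsample_to_electrodes(frame)
    print(len(grid), "x", len(grid[0]))   # 6 x 10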

“In terms of devices like these, we are still at the very crude technology stage,” says Prof Hammond. “They’re only really of use to people who are completely blind. But the thing about technology is that it evolves with amazing speed.”

Less invasive, “wearable” optic gadgetry is catching up fast.

Although still in its infancy, the ability to mount microelectronics within a contact lens is already offering huge potential. Take the Sensimed Triggerfish, for example – a curiously-named soft, disposable silicone lens with a micro-sensor that continuously monitors the shape and pressure of your eyeball, ideal for monitoring the progress of treatment or post-surgical health.

Other lenses are coming on the market too.

In January this year Google announced a lens that tests the level of glucose in the tears of diabetes sufferers, eliminating the finger prick test commonly used several times each day by many diabetics. Others are planned that actually secrete precise dosages of drugs continuously into your system via your eye – even when you’re asleep.

And then the barrier between technology and sci-fi begins to blur.

Already, millions of tiny miniaturised telescopes, known as intraocular lenses, are implanted in patients’ eyes following cataract surgery, to help with the focusing of light into the eye. But the launch of Glass, Google’s web-enhanced spectacles, has prompted research into mounting microelectronic elements onto the polymer of a contact lens itself.

Already mooted, again by the Google X development lab, is a contact lens camera – a boon for, if no one else, the paparazzi.

With enough digital storage capacity, we could record our entire visual experience in real time. But the new “wonder-material” graphene offers greater potential.

As University of Maryland researchers announced in early September, graphene’s broad wavelength sensitivity enables it to detect light frequencies 100 times broader than the normal visible spectrum. And when incorporated into a contact lens, it could allow the wearer to see ultraviolet and infra-red light.

And other scientists are working on mounting suitable optical elements to project information directly into your eyeball, like a fighter-jet-style “head-up display”.

A team at the University of Washington debuted a bionic contact lens with a single-pixel display in 2011; by 2012 the display had increased to a whopping eight pixels. If we end up being able to project images and even videos directly into your eye, you may never have to leave the house for a business meeting or theatre production again.

If all this feels a bit like the futuristic Tom Cruise film Minority Report, then think again – because, well, aspects of that are already happening. Thanks to New York company Eyelock, the concept of scanning a person’s iris from afar for ID purposes is now a reality. As Jeff Carter, Eyelock’s Chief Technology Officer, explains: “Today your identity can be determined from across the room while you’re at a full run – even if you’re wearing a mask, or a wig, or sunglasses – to within a one-in-a-quadrillion certainty that you are who you say you are.”

The Eyelock works by photographing your eyes using a high-resolution camera, then combines 240 unique points on each iris to generate an encrypted code. “To authenticate your ID, our technology matches the code with your eyes,” says Carter confidently. “It’s roughly 2,000 times more powerful than a fingerprint. Only DNA is more accurate.”
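Eyelock has not published its matching algorithm, but the classic academic approach to iris recognition (John Daugman’s iris codes) compares binary codes by fractional Hamming distance. Below is a minimal sketch, borrowing the article’s 240-point figure as the code length; the 0.32 threshold is typical of the literature, not Eyelock’s spec:

    import random

    # A sketch of iris-code matching in the style of Daugman's classic
    # method. Eyelock's actual algorithm is proprietary; the code length
    # and threshold here are illustrative assumptions.
    CODE_BITS = 240          # one bit per feature point, per the article
    MATCH_THRESHOLD = 0.32   # typical Hamming-distance cutoff in the literature

    def hamming_distance(code_a, code_b):
        """Fraction of bits that differ between two iris codes."""
        return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

    def is_same_eye(code_a, code_b):
        return hamming_distance(code_a, code_b) < MATCH_THRESHOLD

    enrolled = [random.getrandbits(1) for _ in range(CODE_BITS)]
    # A fresh capture of the same eye: mostly identical, with sensor noise.
    fresh = [b if random.random() > 0.05 else 1 - b for b in enrolled]
    stranger = [random.getrandbits(1) for _ in range(CODE_BITS)]
    print(is_same_eye(enrolled, fresh))      # True (low distance)
    print(is_same_eye(enrolled, stranger))   # almost surely False (~0.5 distance)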

The individual uniqueness of each eye’s iris – the pattern of lines, dots and colours that surround the pupil – was first noted by Hippocrates in 390 BC.

Even today, its infinite complexity still compels our interest. The plot of writer/director Mike Cahill’s new sci-fi film I Origins, for example, follows a biologist attempting to find an identical pair – and how his discovery has implications for his scientific and spiritual beliefs.

But it’s only with our modern concerns over security, access and identity fraud that the iris’ potential for a foolproof identification system has been realised. Already, for example, over half of India’s population have had their irises scanned as part of a groundbreaking nationwide identity scheme known as UIDAI.

“This year a report by Intel Security estimated the annual worldwide cost of cybercrime to be more than $445 billion,” says Carter. “But [iris scanning] could mean no more credit cards, no more driver’s license, no more passports, no more user IDs or passwords… our everyday lives can be made simpler, better, more seamless and secure.”

And now scientists are delving deeper into the eye than ever before. One widely held belief for decades was that the eye was just a basic, dumb camera. That light would hit the retina (the light-sensitive layer of tissue lining the inner surface of the eye), and electrical signals would then be swiftly transmitted back to the brain where all the heavy visual processing took place.

[Image: Science and spirituality meet in the new film I Origins]

It’s only in the last few years that researchers have discovered the retina is doing a huge amount of pre-processing itself – and that as light passes through the retina’s several dense layers of neurons, much of the detail, such as colour, motion, orientation and brightness, is determined.
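One of the best-understood pieces of that in-retina processing is centre-surround filtering: a cell responds to local contrast, not raw brightness. Here is a minimal sketch, with an illustrative 3-by-3 neighbourhood and weights (not a model of any particular cell type):

    # A sketch of centre-surround processing, one operation retinal
    # neurons are known to perform: respond where a bright centre sits
    # on a darker surround (or vice versa). Weights are illustrative.

    def centre_surround(image):
        """image: 2D list of brightness values; returns a response map."""
        h, w = len(image), len(image[0])
        out = [[0.0] * w for _ in range(h)]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                centre = image[y][x]
                surround = sum(image[y + dy][x + dx]
                               for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                               if (dy, dx) != (0, 0)) / 8.0
                out[y][x] = centre - surround  # excitatory centre, inhibitory surround
        return out

    # A single bright spot on a dark background gives a strong response;
    # a uniformly lit field gives none. That is the point: the retina
    # signals contrast, not absolute brightness.
    img = [[0.0] * 5 for _ in range(5)]
    img[2][2] = 1.0
    print(centre_surround(img)[2][2])   # 1.0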

And so a team from the Massachusetts Institute of Technology (MIT) have started work on a formidable task: intricately mapping this vast network of millions upon millions of neurons to see how they connect and process visual information.

“A huge amount is known about optics and the muscles around the eyes,” says Claire O’Connell, an MIT fellow on the project. “But the retina is the great unknown territory. It’s one of the most complex tissues in the human body.”

And that was the problem: with retinal tissue resembling, well, extremely tangled spaghetti, much of this neuron mapping proved too complex for computers, and had to be done manually – a task estimated to take upwards of 15 years. And so, the team hit on an unusual solution: they made it into a game.

In December 2012, Eyewire was launched – a web-based puzzle game that now boasts over 120,000 players from 150 countries.

“Believe it or not, it was inspired by Angry Birds,” says O’Connell, who helped design Eyewire. “We wondered if the thousands of hours people put into games like that could be used to crowd-source how the retina works at a cellular level. And it turns out it can.”

So now, instead of killing pigs with a deftly placed parrot, players can register at eyewire.org for a different kind of challenge – examining 3D electron microscope scans of neural matter and tracing the path of neurons within it. A few clicks later, an entire neuron, plus its connections, can be identified. And better still, no medical knowledge is required. “There’s a regular player we have called Crazyman,” says Claire. “He’s 16 and from Bulgaria, and he sometimes spends 23 hours in a row helping us in this quest – it’s awesome!

“The game as a whole has been a huge success. Mapping out the precise synaptic connections from one cell type to another would take us two weeks in the lab, but now we can do it in a day. Already, Eyewire has identified the areas responsible for motion detection. The more we discover about the eye, the more amazing it becomes.”
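The aggregation idea behind this kind of crowd-sourcing can be sketched simply: many players label the same voxels, and a consensus rule settles disagreements. Eyewire’s real pipeline is far more elaborate; everything below is a hypothetical illustration:

    from collections import Counter

    # Hypothetical sketch of crowd-sourced segmentation consensus:
    # several players mark whether each voxel belongs to the neuron
    # being traced, and a majority vote settles each voxel.
    # (Eyewire's real aggregation and game mechanics are more elaborate.)

    def consensus(player_labels):
        """player_labels: list of dicts mapping voxel id -> True/False."""
        votes = Counter()
        totals = Counter()
        for labels in player_labels:
            for voxel, in_neuron in labels.items():
                totals[voxel] += 1
                if in_neuron:
                    votes[voxel] += 1
        return {v: votes[v] * 2 > totals[v] for v in totals}

    players = [
        {1: True, 2: True, 3: False},
        {1: True, 2: False, 3: False},
        {1: True, 2: True, 3: True},
    ]
    print(consensus(players))   # {1: True, 2: True, 3: False}

Redundancy is what makes untrained players usable: no single click is trusted, but agreement across many of them is.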

And now scientists stand at a new threshold: the creation of a biological eye itself, that most complex of all bodily organs. Despite the untimely death of Dr Sasai, his colleagues at the RIKEN Center for Developmental Biology announced a new scientific first on September 12: the successful implantation of new retinal tissue, grown for the first time from stem cells, into the eyes of a Japanese woman in her 70s suffering from encroaching blindness.

It could prove to be the first step in eradicating loss of sight in humans for good. But restraint is imperative, cautions Prof Hammond. “The hope is blindness will be a thing of the past in a few years’ time – but we have to be careful about overstating what we can do,” he says.

“The teams in Japan have done fantastic work which holds great promise in terms of creating replacement cells,” he says. “The big problem, however, is how we connect the eye to the brain, and to the relevant pathways in the brain. From that point of view, we’re still in very early days.”

But one thing is certain: in terms of solving the eternal mystery of the eye, and curing the frailties that its infinite complexities present, we have never been more focused. And the future, once dim, is significantly brighter – a sentiment that Dr Sasai echoed in one of his final interviews before his death. “We really don’t know where we are going with this,” he said then. “We really are at the final frontier, facing an unknown world.”

I Origins opens the Raindance Film Festival on September 24, and is on general release from September 26
