Friday, December 29, 2017

How to Choose Optimism #1 Cultivate Positive Thinking


It’s better for our health to seek positive emotions – affection, joy, satisfaction. The links between the heart and brain are well established. A single positive thought can trigger beneficial neurotransmitters and hormones. Oxytocin is the hormone of love, pleasure and orgasm. Serotonin regulates our mood positively. Dopamine stimulates and encourages us. A thought, look or smile is enough to bring down our blood pressure, and make us feel better.

Test these scientific principles. When you wake up, dwell a moment on a dream or something pleasant. When you arrive at work, forget your dreadful journey or the bad weather. Share something positive. When a friend or colleague looks at you, smile and say hi or wave. In these moments, you’ll feel your face relax and a good mood take hold.

Lord, Give Us Today Our Daily Idea(s)


Saturday, June 10, 2017

Essential Thinkers #24 David Hume, Hero of Modern Day Sceptics and Empiricists


David Hume (1711-1776) is the philosophical hero of modern day sceptics and empiricists, renouncing all knowledge except for that which can be gained from the senses. Alas, as W.V.O. Quine would later famously say, echoing Hume, what can be gained from the senses is, after all, not much.

From John Locke, Hume drew the conclusion that all human knowledge is based on relations amongst ideas, or ‘sense impressions.’ Anything not given in experience is mere invention and must be ruthlessly discarded. As a result he denies the existence of God, the self, the objective existence of logical necessity, causation, and even the validity of inductive knowledge itself. His aim is twofold: at once destructive – to rid science of all falsehoods based on ‘invention rather than experience’ – and constructive, to found a science of human nature.

Much impressed with how Isaac Newton had described the physical world according to simple mechanical laws, Hume had a mind to do something similar for the nature of human understanding. His Treatise of Human Nature is a painstaking study in experiential psychology in search of general principles. In this Hume can be seen as having failed spectacularly, primarily because his whole taxonomy of ‘impressions’ and ‘ideas’ is derived from the much-discredited Cartesian model. Nevertheless, Hume’s negative program is a devastating example of the power of logical critique. His sceptical results, especially regarding induction, remain a problem for modern philosophers.

Hume observes that we never experience our own self, only the continuous chain of our experiences themselves. This psychological fact leads Hume to the dubious metaphysical conclusion that the self is an illusion, and that in fact personal identity is nothing but the continuous succession of perceptual experience. “I am,” Hume famously says, “nothing but a bundle of perceptions.”

Following a similar line of thought, Hume notices that the force that compels one event to follow another, causation, is also never experienced in sense impressions. All that is given in experience is the regular succession of one kind of event followed by another. But the supposition that the earlier event, the so-called ‘cause,’ must be followed by the succeeding event, the ‘effect,’ is merely human expectation projected onto reality. There is no justification for believing that there is any causal necessity in the ordering of events.

Hume’s scepticism does not stop there. He regards human belief in causation as just a special case of a more general psychological trait: inductive reasoning. Inductive reasoning is the process that leads us to make generalisations from observing a number of similar cases (remember the fictional character Sherlock Holmes?). For example, having observed many white swans but no black swans, one might seemingly be justified in the conclusion that “All swans are white.” Equally, being aware that men often die, we conclude “All men are mortal.”

But such generalisations go beyond what is given in experience and are not logically justified. After all, black swans were found in Australia, and there is always the logical possibility of coming across an immortal man.

Hume claimed that inductive reasoning could not be relied upon to lead us to the truth, for observing a regularity does not rule out the possibility that next time something different will occur.

Since all scientific laws are merely generalisations from inductive reasoning, this so-called ‘problem of induction’ has been an urgent one for philosophers of science. Trying to show how induction is justified has taxed them throughout the 20th century. Karl Popper is notable for offering the most promising solution to Humean scepticism.
[Summarized from Philosophy 100 Essential Thinkers by Philip Stokes, 2012.]


Lord, Give Us Today Our Daily Idea(s)

Sunday, June 4, 2017

I Wonder #10 How Can I Fool A Lie Detector?


The polygraph test, still used widely in the US, measures your heart rate, breathing and blood pressure as a way to tell how stressed you are feeling. The idea is that the interrogator asks you questions and when you lie, you get more stressed than when you tell the truth, and the difference is revealed in the physiological measures. A simple way to cheat the polygraph is to deliberately distort your physiological readings when telling the truth, such as by biting your tongue, or imagining an embarrassing incident in the past. Similar problems afflict brain scan lie detectors, which follow the same principle of needing a reliable baseline against which to compare signs of lying.
[By Dr. Christian Jarrett, BBC Earth. Asia Edition/Vol.9 Issue 1]

Lord, Give Us Today Our Daily Idea(s)

Wednesday, May 24, 2017

Essential Thinkers #23 John Locke, the Empiricist, on the Nature of Human Understanding


In his day, John Locke (1632-1704) was an important political figure and author of the liberal exposition Two Treatises of Government. An associate of the Earl of Shaftesbury, Locke spent time in exile in Holland, returning to England after the ‘Glorious Revolution’ of 1688. It is for his views on the nature of human knowledge, however, in his Essay Concerning Human Understanding that he is remembered in modern philosophy. Twenty years in the writing, the book was to exert such an influence on the next 100 years of Western thought that its author is considered by many to be the greatest British philosopher of all time. The works of George Berkeley, Immanuel Kant, and David Hume are all direct successors of Locke’s Essay.

The subject of Locke’s Essay, as given in the title, is the nature of human understanding, that is, the very way in which the human mind collects, organises, classifies and ultimately makes judgements based on data received through the senses. Greatly influenced by the scientific turn of his day, and a personal friend of two renowned contemporary scientists, Robert Boyle and Isaac Newton, Locke intended to set the foundations of human knowledge on a sound scientific footing. He had read with great interest Rene Descartes’ Meditations on First Philosophy, but rejected the rationalist philosophy that underpinned its conclusions.

For Locke, there could be no innate knowledge: rather, everything we know must be derived from experience, through the actions of the physical world on our sense organs. This is the view now known as empiricism, a view still central, in essence if not detail, to the philosophies of W.V.O. Quine and other modern thinkers. Locke’s detractors, the Rationalists (Rene Descartes, Benedict de Spinoza, Gottfried Leibniz), with whom the Empiricists battled for ideological supremacy throughout the 17th and 18th centuries, have their modern counterparts in the supporters of Noam Chomsky and his philosophy of innate, or generative, grammar.

Locke states that the mind at birth is like a blank slate, or tabula rasa, waiting to be written on by the world of experience. All human knowledge is derived from ideas presented to the mind by the world of experience. However, these ideas can be classified into two general sorts: simple ideas and complex ideas. Simple ideas are the immediate products of sensory stimulation; examples would be ‘yellow,’ ‘bitter,’ ‘round,’ ‘hard,’ and so on. Complex ideas are constructions out of simple ideas, and are the product of internal mental operations. These include all our ideas of familiar material objects, such as tables, chairs, cats, dogs and horses. But complex ideas need not represent anything real in the world. This accounts for ideas like that of a unicorn, a complex idea itself made up of other complex ideas, such as ‘horse’ and ‘horn.’

Among Locke’s simple ideas there is a distinction between those that are primary qualities of objects and those that are secondary qualities. The distinction divides those qualities thought to be essential and inherent to all objects from those that are apparent only on account of the effect objects have on our senses. Primary qualities are those such as solidity, extension, shape, motion or rest, and number. Secondary qualities are those such as colour, scent and taste. These are secondary because, according to Locke, they do not inhere in objects themselves, but are causally produced in our minds by the effect of an object’s primary qualities upon our senses. Another way of conceiving them is to say primary qualities are objective (really exist) and secondary ones subjective (exist only in the minds of observers).

In the popular conundrum of whether a falling tree makes a sound when there is no one to hear it, Locke’s view would be that the falling tree creates vibrations in the air, but that there is no ‘sound’ strictly speaking, since sound is not a ‘real’ or primary quality. This view, sometimes called ‘scientific essentialism,’ leads to the metaphysical conclusion, plausible to many modern thinkers, that without a perceiving mind, there is no such thing in the world as colour or sound, sweet or sour and so on; but there are really such things as shape, extension and solidity, independently of whether anyone perceives them or not.
[Summarized from Philosophy 100 Essential Thinkers by Philip Stokes, 2012.]

Lord, Give Us Today Our Daily Idea(s)


Wednesday, May 17, 2017

7 Climate Facts You Need to Know #6 Wildlife is Already Hurting

Climate change spells trouble for far more than just the Arctic’s iconic predator, the polar bear. In 2016 scientists announced that the last Bramble Cay melomys, a ratlike rodent found on one low-lying island in Australia’s Torres Strait, had vanished, the victim of forces including rising seas. It’s being called the first documented case of a mammal being driven to extinction by climate change. More will surely follow.

Rising temperatures are depressing some plant and animal populations, driving species toward the poles, shifting migrations and behaviour. Populations of Adelie penguins on the Antarctic Peninsula have plummeted. An Arctic shorebird called the red knot is getting smaller. Ice loss is forcing walruses by the thousands onto land in Alaska. Entire regions are being transformed: Alpine ecosystems from the Rockies to the Swiss Alps are being squeezed off mountaintops. The exceptional ocean warmth of the past few years has triggered coral bleaching and die-offs at reefs around the world.

There will be winners. For now, humpback whales are thriving in newly ice-free waters off Antarctica. Sea urchins too are proving to be resilient. But climate change isn’t the only threat that spreading human populations impose on other species; we’re also fragmenting and destroying natural habitats. Some species will adapt to the jarring changes in their world – but how many, and for how long?
(Summarized from National Geographic Magazine, April 2017)

Verdict: Plants, animals and natural habitats are already being harmed by climate change
Lord, Give Us Today Our Daily Idea(s)

7 Climate Facts You Need to Know #5 Weather is Getting Intense


In the crapshoot that is our weather, climate change loads the dice. It doesn’t cause a particular drought or storm, but it makes such events more or less likely – and in the case of heat waves, a lot more likely. The extraordinary heat wave that killed some 70,000 people in Europe in 2003 should have been a once-in-500-years event; at the current level of global warming, it has become a once-in-40-years event, according to a study published last year. In Paris alone, that analysis found, climate change caused 506 excess deaths in 2003. If it continues unchecked, another recent study said, by late this century people living along the Persian Gulf may face many days so hot that it will be unsafe to go outside.
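The return-period figures quoted above can be made more concrete with a little probability. The sketch below, assuming independent years (a deliberate simplification, not part of the cited study), estimates the chance of living through at least one such heat wave over a 40-year stretch at each return period:

```python
# Chance of at least one "1-in-N-years" event over a horizon of `years`,
# assuming each year is independent -- a simplification for illustration only.
def chance_within(years: int, return_period: int) -> float:
    return 1 - (1 - 1 / return_period) ** years

# The 2003-style European heat wave, before and after warming:
print(f"{chance_within(40, 500):.1%}")  # once-in-500-years event: 7.7%
print(f"{chance_within(40, 40):.1%}")   # once-in-40-years event: 63.7%
```

Under this toy model, warming turns a roughly one-in-thirteen lifetime risk into a nearly two-in-three one.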


It’s not just the heat: Global warming adds moisture to the air, removing it from land and ocean. Where rain is lacking, it makes drought worse. When rain or snow falls, it’s more likely to be extreme; think of the 2016 floods in Paris or Houston. How climate change affects hurricanes and other tropical cyclones is less certain. But by heating the ocean – the storms’ energy source – it’s likely to make them more intense, if less frequent.
(Summarized from National Geographic Magazine, April 2017)

Verdict: Most probably, climate change makes catastrophic weather events worse and more frequent
Lord, Give Us Today Our Daily Idea(s)

Sunday, May 14, 2017

7 Climate Facts You Need to Know #4 Ice is Melting Fast


In late summer 2016 the Crystal Serenity, a large cruise ship, sailed through the ice-free Northwest Passage. Days after it passed, researchers off King William Island found the long-lost wreck of H.M.S. Terror, of Britain’s Franklin expedition – which had become trapped in the ice in 1846 while searching for the passage. The Arctic has warmed dramatically, and its ice cover has thinned and shrunk. That loss speeds the warming, as sunlight is absorbed by dark ocean instead of reflected into space by ice.

Melting sea ice doesn’t raise sea level – it’s already in the water – but melting land ice does. Mountain glaciers are in global retreat. The total sea level rise of 8 to 9 inches since 1900 has contributed to a sharp increase in flooding along coasts. During Superstorm Sandy, for example, floods and winds caused $68 billion in damage on the U.S. East Coast.

The big threat is the ice sheets covering Greenland and Antarctica. They hold enough ice to raise seas more than 200 feet – and they’re losing it. When Earth was just a bit warmer, 125,000 years ago, they seem to have lost a lot: Sea levels were 20 to 30 feet higher. Such a rise today would swamp coastal cities.


How Fast Can Ice Sheets Fail?
Since 2002 Greenland has lost an average of 287 billion metric tons of ice a year, according to NASA satellites. Antarctica is losing less, but it’s vulnerable; much of the West Antarctic ice sheet sits on the seabed, and the floating ice shelves that buttress it are eroding in a warmer ocean – as the calving of a 44-square-mile iceberg into Pine Island Bay illustrates. A glacial collapse that would raise sea level several feet could take centuries. Or maybe just decades.
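As a rough cross-check of Greenland’s 287-billion-ton figure, ice mass loss converts to sea-level rise via the ocean’s surface area. The sketch below assumes an ocean area of about 3.618 × 10¹⁴ m² and a meltwater density of 1,000 kg/m³ – back-of-envelope arithmetic, not a climate model:

```python
# Rough conversion from ice-sheet mass loss to global mean sea-level rise.
# Assumptions: ocean area ~3.618e14 m^2, meltwater density ~1000 kg/m^3,
# and uniform spreading over the ocean surface.
OCEAN_AREA_M2 = 3.618e14
WATER_DENSITY_KG_M3 = 1000.0

def sea_level_rise_mm(mass_loss_gt: float) -> float:
    volume_m3 = mass_loss_gt * 1e12 / WATER_DENSITY_KG_M3  # 1 Gt = 1e12 kg
    return volume_m3 / OCEAN_AREA_M2 * 1000                # metres -> millimetres

# Greenland's average annual loss, per the article:
print(f"{sea_level_rise_mm(287):.2f} mm per year")  # 0.79 mm per year
```

Under these assumptions, roughly 362 Gt of melted ice raises global sea level by about 1 mm, so Greenland alone contributes close to 0.8 mm a year.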
(Summarized from National Geographic Magazine, April 2017)

Verdict: Ice is melting fast, so fast…
Lord, Give Us Today Our Daily Idea(s)


7 Climate Facts You Need to Know: #1-3 The World is Warming


#1 THE WORLD IS WARMING

The heat in 2016 broke the historic record set in 2015, which broke the one from 2014. Last year’s average global surface temperature, compiled from measurements made by thousands of weather stations, buoys, and ships, was 1.69 degrees Fahrenheit warmer than the 20th-century average. Satellites probing the atmosphere also have documented a clear warming trend.

#2 IT’S BECAUSE OF US

El Niño added to last year’s record by temporarily releasing heat from the Pacific. But no natural cause explains the half-century warming trend. The sun’s output cycles up and down every 11 years; volcanic eruptions sporadically cool the planet. Meanwhile, human-emitted greenhouse gases form a steadily thickening blanket that traps heat at Earth’s surface.

#3 WE’RE SURE

More than 9 out of 10 climate scientists agree: carbon emissions cause global warming. We’ve known about the greenhouse effect since the 1800s. Swedish physicist Svante Arrhenius even predicted in 1896 that carbon dioxide from coal burning would warm the planet. He saw it as a good thing, and just how bad the warming will get is still debatable. But it’s real, and it’s dangerous.
(Summarized from National Geographic Magazine, April 2017)



Verdict: Climate change isn’t a hoax or a scientific conspiracy, it’s a grand challenge!
Lord, Give Us Today Our Daily Idea(s)

Saturday, May 13, 2017

Is Technology Changing Our Brains #8 Navigation Skills, Grey Matter and GPS


In 2000, a study found that taxi drivers who acquire The Knowledge – which requires memorizing thousands of London streets – have a greater volume of grey matter in the posterior hippocampus but less in the anterior hippocampus, making them better at memory tasks involving landmarks but poorer at recalling complex visual information. This provided evidence for plasticity in the adult human brain.

Could our reliance on GPS also be changing the way our brains work? Researchers from McGill University in Canada used MRI scans to compare GPS users with non-GPS users. Those who navigated without GPS had higher activity and a greater volume of grey matter in the hippocampus than those who relied on GPS.

In another study, people who drove a route using sat-nav could not remember scenes from the journey as well as those without sat-nav, and were poorer at retracing their steps from memory alone.

“It’s possible that reliance on technology could cause some brain areas to grow and others to shrink,” says University College London’s Dr Sam Gilbert. “Something similar was shown in the original taxi driver study. But occasional use of sat-nav probably won’t have as strong an effect as learning The Knowledge and relying on it as part of your job.”
(Summarized from BBC Earth Magazine (Vol.9 Issue 1), page 34 by Jo Carlowe)

Verdict: Yes, technology may change our brains when it comes to navigation
Lord, Give Us Today Our Daily Idea(s)

Is Technology Changing Our Brains #7 Better Multitasker at a Cost


Our ‘always-on’ culture has been dubbed ‘infomania’ by psychologist Dr Glenn Wilson, who tested the IQs of subjects in either a quiet room or one with mobiles ringing and emails arriving. The technological distractions diminished IQ by 10 points.

Similarly, a US study found that students who instant messaged with friends during a reading task took between 22% and 59% longer to complete their task, even accounting for the additional time spent messaging.

Brain-imaging reveals that multitasking uses different brain regions from focusing on a single task. Learning while focusing on one task uses the hippocampus, which stores ideas and creates rich and flexible memories. This area allows us to compare old ideas with incoming data to put what we learn into context, effectively leading to deeper understanding. Multitasking, on the other hand, uses the striatum – a brain region that stores procedures and skills. New information acquired using the striatum is less flexible and can’t be generalized in the same way. This suggests that knowledge acquired while multitasking is less deeply embedded in our memories.

Researchers from University College London recently linked frequent multitasking to lower grey matter density in the anterior cingulate cortex (ACC), the brain region involved in empathy and decision-making. However, it is unclear whether having a smaller ACC makes you more likely to multitask, or whether it’s multitasking that causes the ACC to shrink.

But some experts say technology has made us all more skilful at multitasking. Hong Kong researchers report multitaskers are better at multisensory integration, while a 2016 study from Microsoft found our ability to multitask has “improved drastically” since the turn of the millennium.
(Summarized from BBC Earth Magazine (Vol.9 Issue 1), page 33 by Jo Carlowe)

Verdict: Technology may make us more adept multitaskers, but perhaps at a cost
Lord, Give Us Today Our Daily Idea(s)

Is Technology Changing Our Brains #6 Deep Reading vs. Skim, Scan and Click!


When we read, we construct a mental representation of the text in our minds, much as we do when we look at terrain and create a mental map in our heads. But experts warn that we read text on screens differently, preferring to skim, scan and click hyperlinks, rather than ‘deep reading’ in the old-fashioned sense.

Norwegian experts tested the theory by dividing students of comparable reading skills into a paper group and LCD monitor group. In a follow-up reading comprehension test, the group who’d read texts on computers performed a little worse than the traditional readers. And a Swedish study in which volunteers completed a reading test reported similar findings: those who took the test on a computer scored lower, and reported higher stress levels, than those who took the same test on paper.

Prof Ziming Liu, of the School of Information at San Jose State University in California, believes digital screen readers engage in greater use of shortcuts such as browsing for keywords. His research also reveals that screen users are more likely to read a document only once and spend less time on in-depth reading.
(Summarized from BBC Earth Magazine (Vol.9 Issue 1), page 33 by Jo Carlowe)

Verdict: More research needed, but technology may make us less thorough
Lord, Give Us Today Our Daily Idea(s)



Is Technology Changing Our Brains #5 Bedtime and Sleeping Patterns


We now spend more time on our devices than we do sleeping. According to an August 2015 Ofcom survey, we engage in media or communication activities such as texting or gaming, for 8 hours and 41 minutes daily, and sleep for 8 hours and 21 minutes.

Technology keeps us up for two reasons. First, we are stimulated by the content. Second, the LED screen emits blue light, which prevents the brain from producing the sleep hormone melatonin. The blue light is in a bandwidth one sees in everyday sunlight, explains health education expert Dr Aric Sigman. “The blue light from your phone or tablet informs your pineal gland that it’s morning and it should shut down production of melatonin.”

In the journal Preventive Medicine (2016) researchers found a strong association between social media use and sleep disturbance, and warned of a link between sleep deprivation and depression. Sleep deprivation has also been associated with obesity and poor academic performance.
(Summarized from BBC Earth Magazine (Vol.9 Issue 1), page 32 by Jo Carlowe)

Verdict: Screen use at bedtime will change your sleeping patterns
Lord, Give Us Today Our Daily Idea(s)

Is Technology Changing Our Brains #4 Hello, Social Skills Bro


“Throughout the world – in caves, huts and houses – it was almost a reflex to turn your face to a returning parent,” explains health education expert Dr Aric Sigman. But, he says, kids are now so glued to their screens they no longer look up.

Though some parents might be glad of the respite that screen-time provides, research suggests that excessive screen use seems to damage our ability to interpret faces. “They [excessive internet users] find it more difficult to read faces in experiments,” explains Sigman.

In one study, children showed a significant improvement in reading facial emotions after spending 5 days away from all devices. In another experiment, Chinese psychologists scanned the brains of ‘normal’ versus ‘excessive’ internet users while they viewed images of faces and objects. The internet junkies showed smaller brain wave responses to faces than their peers.

Sigman’s view is that technology use itself isn’t damaging – just like sweets, it’s simply a case of ensuring children don’t consume too much, too often. Prof Mizuko Ito of the University of California, meanwhile, believes that a reasonable serving of new media can actually be beneficial for the development of youngsters’ brains.

“Young people who are taking advantage of online tools like search, forums, open educational resources and complex games are learning at a more accelerated rate, and in specialties that they would never have had access to in earlier eras,” she argues.

However, she adds that for disengaged kids in distressed circumstances, digital media can be a distraction from positive learning and social engagement. “It’s not the availability of media that determines this, but whether they have life opportunities, positive peer influences and caring adults who support and guide them to positive media engagements.”
(Summarized from BBC Earth Magazine (Vol.9 Issue 1), page 32 by Jo Carlowe)

Verdict: New media is just a place to ‘hang out’, but for the socially disengaged there are risks
Lord, Give Us Today Our Daily Idea(s)

Is Technology Changing Our Brains #3 Bad Memory, Maybe


With phone numbers, routes and facts just a touch away, we’re becoming less reliant on our memory – and German neuroscientist Manfred Spitzer warns this ‘cognitive offloading’ could be leading to a kind of ‘digital dementia.’

Studies on internet and gaming addicts have uncovered atrophy (shrinking) in the brain’s grey matter, says the University of Bedfordshire’s Prof James Barnes. Overdosing on technology seems to cause the frontal lobe – a brain area that governs functions such as planning and organizing – to suffer in particular. However, he adds that more research is needed on ‘real’ as opposed to ‘addicted’ internet users.

Digital offloading may also make memories less vivid. A US study asked museum visitors to photograph some exhibits and just look at others. The next day their memory was tested. Visitors were worse at recognizing the objects they had photographed, and worse at recalling details about them.

But Dr Sam Gilbert, of University College London, says there are also positives. “Research shows that when you save information to an external store like a computer, this can help you to store new memories. Your mind is no longer cluttered with information that you don’t need.”
(Summarized from BBC Earth Magazine (Vol.9 Issue 1), page 31 by Jo Carlowe)

Verdict: Short-term changes are likely but more research is needed on long-term impact
Lord, Give Us Today Our Daily Idea(s)

Is Technology Changing Our Brains #2 Moody Media-Users


Scientists have been reporting strong links between heavy internet use and depression, with a particular focus on social media. This came as no surprise to health education expert Dr Aric Sigman, who says high exposure to social media can leave people feeling inadequate. “There is a relationship between the amount of time you spend on social media and increased body dissatisfaction. High consumption of idealized images seems to activate neural networks in the brain like the amygdala, associated with fear and anxiety.”

Sigman cited a study in which girls who instant messaged their mothers released the stress hormone cortisol, rather than the feel-good hormone oxytocin associated with face-to-face interaction. “We may be hard-wired to need a certain amount of contact with people we care about. A deficit in human contact may result in health problems.”

Facebook, it seems, might not be giving us enough facetime.
(Summarized from BBC Earth Magazine (Vol.9 Issue 1), page 31 by Jo Carlowe)

Verdict: Technology can affect mood, but it depends how we use it
Lord, Give Us Today Our Daily Idea(s)


Is Technology Changing Our Brains #1 May I Have Your Attention Please!


Phones buzzing with text messages, Facebook notifications and news alerts continually tempt the world to distraction. Many experts believe that this incessant bombardment, and the need for instant answers, has eroded our ability to focus.

A 2015 study by Microsoft surveyed 2,000 Canadians and used electroencephalograms (EEGs) to watch the brain activity of a further 112 people. Their analysis found that the average human attention span had dropped from 12 seconds in the year 2000 to just eight seconds. Goldfish are thought to possess an impressive nine-second attention span.

This wasn’t just a company chasing a catchy headline. The research in this area is mostly anecdotal, but a number of surveys do back up the idea that attention spans are shrinking. In a 2012 Pew Research Center survey of more than 2,000 teachers in the US and Puerto Rico, 87% reported that their students had short attention spans and were easily distracted. The same year, a UK poll by the learning company Pearson reached the same conclusion: of 400 UK English teachers and 2,000 parents of preschool and primary-aged children questioned, 7 out of 10 said that children’s attention spans were shorter than they used to be.

Meanwhile in the US, the Centers for Disease Control and Prevention has reported that 11% of school-age children have, at some point, been diagnosed with Attention Deficit Hyperactivity Disorder (ADHD). Before 1990, the figure was less than 5%.

These studies shine a spotlight on our diminishing attention spans, with modern technology in the crosshairs as the culprit. More research is needed if we’re to be sure of a causal relationship, but experts feel certain they’ll find one. “I am personally convinced that technology has led to a decreased ability to focus and wait, and an increased need for immediate information,” says neuroscientist Prof Russell Poldrack, of Stanford University.
(Summarized from BBC Earth Magazine (Vol.9 Issue 1), page 30 by Jo Carlowe)

Verdict: Yes, the information age has shortened our attention span
Lord, Give Us Today Our Daily Idea(s)

Tuesday, April 18, 2017

Essential Thinkers #22 Albert Einstein, Genius of 'Special' and General Relativity


I post this article today, 18th April 2017 – exactly 62 years since the death of Albert Einstein on 18th April 1955. I can’t wait to watch the National Geographic series Genius about the life of Einstein! (premieres 25th April 2017)

German-born physicist of Swiss parentage, Albert Einstein (1879-1955) became a naturalised American citizen in 1940, after leaving Hitler’s Germany to avoid persecution as a Jew. After an unpromising start to his academic career, at one time declaring, “I have given up the ambition ever to get to a university,” he accepted a job in the Bern patent office, where he conceived the theories of special and general relativity that were to found modern physics. Einstein was also politically active, both in the cause of world peace and Zionism. In 1952 he was offered the presidency of Israel but declined, claiming he was too naïve in politics. On the relation between his scientific and political interests he once said, “Equations are more important to me, because politics is for the present, but an equation is something for eternity.”

The philosophical import of Einstein’s work is enormous. His theory of relativity assigns an unprecedented importance to the role of the observer in the description of the physical world, threatening the received notions of space and time found in Isaac Newton, John Locke, Immanuel Kant and others. The central tenet of Einstein’s work is that the speed of light is constant. It gives rise to the two most famous ideas of relativity physics: the equivalence of mass and energy expressed in the equation E = mc² (where E = energy, m = mass and c = the constant speed of light), and the law that nothing can travel faster than the speed of light.

These have at least two philosophically important consequences. First, it follows from relativity that one cannot speak of an event occurring at precisely the same time for different observers. Each observer’s time frame is relative to himself. Imagine an observatory on Jupiter looking at an observatory on Earth. In each, an astronomer looks through his telescope at the other at, we might suppose, exactly the same time. Since light takes some 35 minutes to travel from Jupiter to Earth, the event on Jupiter in which the astronomer looks through his telescope must have taken place 35 minutes before the astronomer on Earth observes it. Equally, the same applies to the astronomer on Jupiter: as he observes the astronomer on Earth he is observing an event that took place 35 minutes before his own time frame. It is tempting to think there is some absolute position from which the two events could be observed as simultaneous, but this is exactly the possibility ruled out by relativity theory. Space and time are not independent dimensions, but form a four-dimensional unity, space-time, in which every event can only be recorded relative to a local time frame.
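The 35-minute figure is simple arithmetic: distance divided by the speed of light. The sketch below assumes an Earth-to-Jupiter distance of roughly 630 million km; the actual distance varies considerably with the planets’ orbital positions, so the travel time ranges from about half an hour to nearly an hour:

```python
# Light travel time from Jupiter to Earth, under an assumed distance.
SPEED_OF_LIGHT_M_S = 299_792_458   # metres per second (exact, by definition)
DISTANCE_M = 630e9                 # ~630 million km: an assumed, mid-range value

travel_time_s = DISTANCE_M / SPEED_OF_LIGHT_M_S
print(f"light travel time: {travel_time_s / 60:.0f} minutes")  # 35 minutes
```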

The second philosophically interesting consequence of relativity is that although the speed of light is constant, its frequency (the number of waves of light per second) varies near massive objects such as planets. This means time appears to run slower near a massive body than farther away. In 1962 physicists confirmed this prediction using two very accurate clocks, one at the base and one at the top of a water tower. The clock at the base was found to run slower than the other.

This gives rise to the famous ‘twin paradox.’ Suppose one twin goes on a lengthy journey into space while the other stays on Earth. When he returns, he will be much younger than his twin. The paradox arises only on the assumption of an absolute time frame. The relativity thesis means that each body carries its own personal time scale, which does not, in general, agree with the time scale of other entities. Relative to each other, 50 years near a massive gravitational body is a shorter duration than 50 years far away from it. Thus while 50 years might have passed on Earth, the space-travelling twin might find he has been away for only 35 years. The exact difference depends on the gravitational influences on the two twins throughout their lives.
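As a rough illustration (the numbers below are assumptions chosen to match the 50- and 35-year figures above, and they use the velocity-based form of time dilation from special relativity rather than the gravitational form discussed in the text):

```latex
% A moving clock records proper time \tau while time t
% elapses for a stationary observer:
\tau = t\,\sqrt{1 - \frac{v^{2}}{c^{2}}}
% If t = 50 years pass on Earth while the traveller ages
% \tau = 35 years, then \sqrt{1 - v^{2}/c^{2}} = 35/50 = 0.7,
% so the required cruising speed is
v = c\,\sqrt{1 - 0.7^{2}} \approx 0.71\,c
```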

The philosophical consequences of Einstein’s relativity theory, like the empirical consequences, are yet to be fully known. Issues about time-travel, the passage or ‘flow’ of time, the asymmetry between past and future and between cause and effect, are all issues that require an understanding of Einstein’s momentous work.
[Summarized from Philosophy 100 Essential Thinkers by Philip Stokes, 2012.]


Lord, Give Us Today Our Daily Idea(s)

Friday, April 14, 2017

Essential Thinkers #21 Benedict de Spinoza: Mystical, Rational and Theistic


Dutch philosopher of Jewish origin, Benedict de Spinoza (1632-1677) remains one of the most compelling, if not the most difficult, philosophers of the Rationalist school. Greatly influenced by Rene Descartes and Euclid, he takes rationalism to its logical extreme, determining to set out the principles of an ethical system in axiomatic format, much as Euclid proved his theorems of geometry. Spinoza’s ambitious project is perhaps one of the greatest ever undertaken in philosophy, and it is a mark of his greatness that, to a considerable extent, he was remarkably successful in it.

In the posthumously published Ethica ordine geometrico demonstrata (Ethics demonstrated in geometrical order), Spinoza sets out the axioms which he takes to be self-evident and then proceeds, step by step, to deduce ethical conclusions. Like Descartes, he is concerned to set knowledge on logical foundations: his ethical conclusions must therefore first be founded on a number of ontological, metaphysical and epistemic beliefs. Each of these is, in turn, demonstrated in geometric fashion.

Central to Spinoza’s philosophy is the idea, similar to that of Parmenides of Elea, that everything in the universe is One. There is only one substance, and that substance we can conceive of as either Nature or God. This substance has infinitely many attributes, but human beings, being finite, can only perceive two of them: extension and thought. Unlike Descartes, who thought mind and body were two separate kinds of thing, Spinoza argues that mind and body are just different ways of conceiving the same reality.

This reality, Nature or God, is wholly self-contained, self-causing and self-sufficient. Everything in the universe is part of God, and everything that happens is a necessary expression of the divine nature. The upshot of this pantheistic view is to remove free will from the realm of human actions. After all, if human beings are part of the divine reality, there is no room for independent causal actions. Spinoza is more than happy with this conclusion; he is a thorough-going determinist: “Experience tells us clearly that men believe themselves to be free simply because they are conscious of their actions and unconscious of the causes whereby these actions are determined; further, it is plain that the dictates of the mind are simply another name for the appetites that vary according to the varying state of the body.”

Nevertheless, Spinoza does find a way of making room for a kind of freedom, though it is not of the sort that philosophers are used to. Each individual, says Spinoza, is a localised concentration of the attributes of reality — really a quasi-individual, since the only true individual is the universe in its totality. Insofar as the quasi-individual is ruled by his emotions, he is unfree and at the mercy of finite understanding. To become free, the individual must, by means of rational reflection, understand the extended causal chain that links everything as one. To become aware of the totality of the universe is to be freed, not from causal determinism, but from an ignorance of one’s true nature.

What then, of wickedness, sin and evil?

Since everything is part of one reality, there is no such thing as evil from the viewpoint of the whole — “sub specie aeternitatis” (from the aspect of eternity). That which appears evil does so only because we lack the understanding to see the bigger picture, the chain of causes that makes all events a necessary part of divine reality. Though many were shocked by this in Spinoza’s day, it reflects the same sentiment expressed by some Christians who persevere in the face of adversity by claiming that “God moves in mysterious ways” and “ours is not to reason why.” Of course, for Spinoza, to reason why is exactly what we must do to attain freedom.

Interestingly, Spinoza’s philosophy is at once mystical, rational and theistic. Yet he was excommunicated from the Jewish community for his views, denounced as an atheist by Christians, and declared so wicked that at one time his books were publicly burnt. Despite the rigour and integrity of his work, Spinoza remains one of the lesser studied and least regarded of all the rationalist philosophers.
[Summarized from Philosophy 100 Essential Thinkers by Philip Stokes, 2012.]

Lord, Give Us Today Our Daily Idea(s)

Wednesday, April 5, 2017

As A Man Thinketh #5: No Victimized Mindset, Take Responsibility


A person is buffeted by circumstances so long as he believes himself to be the creature of outside conditions
(James Allen, As A Man Thinketh)

One of the great weaknesses of our society today is the growing attitude of victimization. Many people claim to be victims of some outside force. “I don’t know the story of the Bible because my pastor doesn’t teach me…”; “If that driver hadn’t pulled out in front of me…”; “I am like this because of my parents…”

When we are victims of circumstances, or as James Allen says, a “creature of outside conditions,” we have no power. We have given over the power in our life to the circumstances. The longer we give power to our circumstances, the worse our circumstances become. In his other book, Above Life’s Turmoil, Allen writes, “You imagine your circumstances as being separate from yourself, but they are intimately related to your thought world. Nothing appears without an adequate cause.”

To get control of our circumstances we must first acknowledge personal responsibility for being where we are. That was the hardest part for me because the ‘victim’ in all of us doesn’t want to take that responsibility.

When we take responsibility we must then take control of our thoughts. And, yes, in the beginning that can be hard. It sometimes seems that it’s our nature to think negatively first. But that’s just because it’s the habit we’ve developed. And like any habit, it can be changed by replacing it with the habit of thinking the right way.

Emmet Fox once wrote: “You are not happy because you are well. You are well because you are happy. You are not depressed because trouble has come to you, but trouble has come because you are depressed. You can change your thoughts and feelings, and then the outer things will change to correspond, and indeed there is no other way of working.”

Think about it!

Lord, Give Us Today Our Daily Idea(s)

Tuesday, March 7, 2017

Essential Thinkers #20 Rene Descartes "I Think, Therefore I Am"


“Cogito, ergo sum” (I think, therefore I am)

French philosopher and mathematician Rene Descartes is often called the father of modern philosophy. He is known to physicists as the discoverer of the law of refraction in optics, but his most famous work is in philosophy. Meditations on First Philosophy set the agenda for speculation in the philosophy of mind and epistemology for at least the next 300 years. He raised problems of such radical scepticism about our knowledge of the world that he suggests the only thing one can be absolutely certain of is the fact of one’s own existence, an insight summed up in his famous dictum “cogito, ergo sum,” popularly translated as “I think, therefore I am.”

Descartes’ program in the Meditations is to put the edifice of human knowledge upon secure foundations. Reviewing his beliefs, he finds that many are contrary to one another. Some are more or less justified than others; some, such as the propositions of mathematics, seem certain; others readily turn out to be false. He resolves to put some kind of order into this jumble of beliefs so that the justification of one proposition may follow from another. In order to do that he needs to begin with whatever is most certain and infallible. The question is, where to start?

Descartes comes up with an ingenious program. Rather than attempt to examine and order each belief in turn — a task impossible to contemplate — he decides to test his beliefs against a method of doubt. The method of doubt consists in questioning the source of his beliefs and asking whether that source is infallible. If not, he can be sure that any belief from that source cannot be relied upon to provide the foundations of knowledge.

To begin with, Descartes notes that many of his beliefs are derived from his senses, or from perception. He notes that the senses, however, can often mislead. A stick may look bent when viewed half submerged in water, the true size of the sun and the moon is many times greater than would appear from sight, and so on. One can even suffer hallucinations such that what one thinks to be there does not exist at all. Descartes resolves not to trust completely that which has deceived him once, and therefore rejects any information from the senses as being uncertain and fallible.

Even so, one might think that although the senses may deceive from time to time, Descartes can be sure, at least, that he is sitting in his study, or that he is a Frenchman with an interest in philosophy, and so on. But he recognizes that there is no clear and distinct way of telling the difference between reality and dreaming. How does he know that the life he thinks he is leading is not just part of a dream? There is no clear way of distinguishing between waking life and a life merely dreamt.

So, rejecting all perceptual knowledge, Descartes turns to what he believes on account of his own internal reflections. Surely he knows that 2 + 3 = 5, that a mother is older than her daughter, that a triangle has three sides? But it could be the case, reflects Descartes, that he is the subject of a massive deception. Now Descartes imagines a scenario wherein he might be deceived by a divinely powerful, but malignant being; a demonic being that could manipulate his thoughts, as God might if he were not supremely good, into thinking anything the demon might choose.

This idea of wholesale radical deception has been the subject of popular films such as The Matrix and Twelve Monkeys. Descartes realises, however, that there is one proposition that neither the evil demon nor even God could make false. This is that at any time when he thinks, it must be the case that he exists. For he must exist in order to be able to think. By such reasoning Descartes is led to the cogito as the one certain, infallible rock of knowledge.

For Descartes, the cogito was the beginning of a project in which he attempted to prove the existence of God, in order to guarantee the rest of human knowledge. His commentators, unimpressed by his weak version of Anselm’s ontological argument or his own “trademark argument” for the existence of God, have taken the Meditations to be the definitive work of epistemological scepticism.
[Summarized from Philosophy 100 Essential Thinkers by Philip Stokes, 2012.]

Lord, Give Us Today Our Daily Idea(s)

Monday, February 6, 2017

As A Man Thinketh #4 The Most Basic and Logical Principle


Good thoughts and actions can never produce bad results; bad thoughts and actions can never produce good results. This is but saying that nothing can come from corn but corn, nothing from nettles but nettles
(James Allen, As A Man Thinketh)

Almost everyone understands the biblical concept of sowing and reaping, because we can grasp the simplicity of its logic. If we planted durian on our farm, we wouldn’t expect apples to come up. But even though we can grasp the logic, we don’t always act as if we understand the power of this principle. And we certainly don’t act as if this principle will affect us.

An example: For many years my morning ritual began with video games (or PSP, to be exact). Most mornings I spent an hour or more on games and the morning news before dashing off to the office. I didn’t realize then that our minds are most impressionable immediately upon rising in the morning and just before sleep in the evening. It shouldn’t have come as a surprise to me that my sowing of these thoughts would reap an ‘attitude’ at my workplace (impatient, demanding, shouting, etc.).

I gave up my morning ritual seven years ago and replaced it with a habit of reading. I read my Bible or book of the week, and on the way to work I listened to motivational or self-development audiobooks. When I sow “good thoughts,” I reap “good results.” The Apostle Paul wrote, “You’ll do best by filling your minds and meditating on things true, noble, reputable, authentic, compelling, gracious—the best, not the worst; the beautiful, not the ugly; things to praise, not things to curse” (Philippians 4:8, The Message).

We always reap what we sow and that is especially true with our thoughts. As Emmet Fox writes, “The secret of life then is to control your mental states, for if you will do this the rest will follow. To accept sickness, trouble, and failure as unavoidable, and perhaps inevitable, is folly, because it is this very acceptance by you that keeps these evils in existence. Man is not limited by his environment. He creates his environments by his beliefs and feelings. To suppose otherwise is like thinking that the tail can wag the dog.”

Think about it!

Lord, Give Us Today Our Daily Idea(s)


Saturday, January 28, 2017

I Wonder #9 Why Do Introverts Find Social Situations So Tiring?


Apart from the simple fact that introverts prefer to have plenty of time to themselves, and that being denied this can be draining, there’s also evidence that, at a physiological level, introverts respond more strongly to stimulation, such as loud noises, than extroverts do. This means socialising is more likely to leave them exhausted and needing a rest afterwards. On the Introvert, Dear blog, contributor Shawna Courter even argues that there’s such a thing as an “introvert hangover,” which she describes as “an actual physical reaction to [social] stimulation. Your ears might ring, your eyes start to blur, and you feel like you’re going to hyperventilate.”*


Lord, Give Us Today Our Daily Idea(s)

*Taken from BBC Earth Vol 9, Issue 1. Page 82 by Dr Christian Jarrett

Tuesday, January 24, 2017

Essential Thinkers #19 Isaac Newton, the Man Who Discovered Gravity


“This most beautiful system of the sun, planets and comets, could only proceed from the counsel and dominion of an intelligent and powerful Being.”
(Isaac Newton)

A mathematician and physicist, Sir Isaac Newton (1642-1727) produced work — philosophical to a degree — which served mainly as an impetus and basis for many of the philosophers of his and succeeding generations, including John Locke and Immanuel Kant, who both owed much to him. Newton’s principal work, the Philosophiae Naturalis Principia Mathematica, contains his theory of gravity and laws of motion. His later work, the Opticks, is primarily concerned with optical physics but also contains speculations on mechanics, religion and morals. He was involved in a series of disagreements with Gottfried Wilhelm Leibniz, initially over which of them was the first to invent the calculus, and later over the status of space and time.

The insight behind Newton’s physics was that the universe runs according to law-governed mechanical principles. This idea was to have a profound influence on John Locke, whose philosophy may be seen as the philosophical working out of Newton’s physical principles. Locke was determined to make sense of human understanding in a way consistent with Newtonian mechanics. As a result, he argued for a causal theory of perception and for a distinction between primary and secondary qualities of objects.

Immanuel Kant, in similar fashion, recognized that everything in the phenomenal world had to conform to Newton’s principles, but held that this order was for the most part imposed by the psychological apparatus of the mind. Kant’s philosophy gave support to Newton in the quarrel with Leibniz over whether space and time should be conceived of as absolute or merely as relations between objects. The debate seemed to have been won hands down by the Newtonians until the advent of Einstein’s relativistic physics.

Claiming that his method was empirical and inductive, rather than rationalist and deductive, Newton was also fond of criticizing Rene Descartes. It is thanks to Newton that empiricism began to enjoy a period of dominance over rationalist philosophy. However, Newton owed much to Descartes’ thought, and it is likely his own speculations could not have begun but for the work already undertaken by his rationalist predecessor.

Undoubtedly, Newton’s greatest achievement was his theory of gravity, from which he was able to explain the motions of all the planets, as well as that of the moon. Newton proved that every planet in the solar system at all times accelerates towards the sun. The acceleration of a body toward the sun is at a rate inversely proportional to the square of its distance from it. This led to Newton’s law of universal gravity: “every body attracts every other with a force directly proportional to the product of their masses and inversely proportional to the square of the distance between them.” The law of universal gravity allowed Newton to predict all of the planetary motions, the tides, and the movements of the moon and of the comets.

It was a striking achievement that would not be superseded until Albert Einstein, although even with the advent of Einsteinian relativity, Newton’s mechanics still holds good — and indeed is still used, on account of its simplicity, for predicting the movement of so-called ‘medium-sized’ objects — anything that is neither bigger than the solar system nor smaller than the eye can see. Newton’s work is a profound and remarkable achievement in the history of human thought.
[Summarized from Philosophy 100 Essential Thinkers by Philip Stokes, 2012.]

Lord, Give Us Today Our Daily Idea(s)
