
Archive for the ‘Management Research’ Category

Part 2: Epistemological Grounding of Thesis

December 27, 2009

In line with all scientific research, this study adopts certain philosophical assumptions. Four sets of assumptions about the nature of social science will be considered (Burrell & Morgan, 1979). At an ontological level, this research leans towards a realist approach, which holds that there is an external world consisting of structures that can be examined and understood through empirical research. In common with realism, the author, and therefore this research, is concerned with the factors that lead to particular outcomes and tends to avoid generality by accepting temporal, spatial and organisational uniqueness. At an epistemological level, this research adopts an intermediate position between positivism and anti-positivism: the author believes that knowledge is a cumulative process (positivism) but rejects the (positivist) notion that a set of generalised laws can be developed to explain the social world.

As for the human nature debate, the research settles for a compromise between voluntarism and determinism. This is a difficult topic because, at one extreme, voluntarism implies that humans are completely autonomous and our decisions are never affected by the environment or situation we are in; at the other extreme, determinism posits that all human actions are determined and can be explained by our environment, and that we do not have ‘free will’. Implicit in the compromise adopted here is a view that social science is too young to untangle the complex task of isolating the voluntary and situational factors that affect human activities (i.e. are humans determined by their environment or do they have free will?). Given our present understanding and knowledge (or the lack of it), which, as set out above, is biased towards realism, it is prudent to assume that humans do exercise ‘free will’ but are, to a certain extent at least, also a product of their environment. If, however, advances in sociology provide new evidence regarding the influence of ‘environment’ over the autonomy of humans, the author would be happy to realign his position on the voluntarism-determinism scale.

One important argument that studies emphasising a voluntaristic view have drawn upon to defend their position on human nature (Pasanen, 2003) relates to the intentionality of humans. For example, voluntaristic theorists argue that human beings can set future goals and objectives that make their present behaviour understandable, yet at the same time an entrepreneur may choose to pursue goals that are not economically rational (i.e. altruistic goals instead of profit maximising). Deterministic theorists would argue that the keyword here is choice. Did the entrepreneur really choose altruism, or did his upbringing (and therefore his environment) play a role? Can his choice be explained by his experiences? Perhaps he was born altruistic, with genes that predispose him to such behaviour? In that case, did he really make the choice of setting philanthropic goals for his company, or was his choice determined by his experiences and environment?

While it may be difficult for some readers to accept that one’s conscious thoughts may not be entirely one’s own, it is worth noting that a conscious belief of having a ‘choice’ (free will) does not necessarily equate to having one.

Theorists from both camps have yet to determine the line of demarcation between what constitutes autonomous and heteronomous behaviour; this research therefore adopts an intermediate position that considers both situational and voluntary factors when accounting for the activities of an entrepreneur/SME manager.

Lastly, in terms of the methodological dimension, this research has a stronger preference for the ideographic approach over the nomothetic approach. As previously discussed, this research is concerned with factors that lead to a particular outcome (realism) and avoids generality; in line with that assumption, the author therefore shies away from the nomothetic approach, which relies on the scientific method of hypothesis testing (such as quantitative surveys and standardised research tools). Ideographic approaches to social science emphasise the analysis of subjective accounts that one generates by ‘getting inside’ a subject and exploring the subject’s background and history. For example, a study of outliers (from scientific and musical geniuses to business luminaries and sports stars) ascribes individual success to a sequence of events that occur throughout the subjects’ lives (Gladwell, 2008). Success, according to Gladwell, can be explained by a series of cumulative factors that can be pinpointed to certain historical and biographical occurrences unique to the subjects.

Conventional wisdom would argue that the ideographic approach lacks predictive power: it is descriptive and analytical rather than predictive. At postgraduate or doctoral level, research should be analytical or predictive, as opposed to the exploratory or descriptive research expected at undergraduate level (Collis & Hussey, 2009). Such criticisms often originate from researchers in the hard sciences who embrace the nomothetic methodology characterised by quantitative techniques. Yet such criticisms are often unwarranted, as the complexity and subjectivity of the social sciences do not allow for the luxury of the statistical significance and confidence intervals that serve researchers of the natural sciences so well. As ‘success studies’ have shown, attempts to predict ‘winners’ in the field of management research have failed miserably in terms of their predictive power. For all the criticisms levelled at the ideographic approach and its lack of predictive power, the nomothetic approach and its extrapolative powers have not fared any better in addressing the holy grail of business management – the quest for sustained performance.

While sympathetic to the assumptions of a nomothetic approach, this research, as previously mentioned, rejects the notion that a set of generalised laws can be developed to explain the social world. Accordingly, it adopts the ideographic view that one can understand the social world by ‘getting inside’ and gaining first-hand knowledge of the subject.

In line with the author’s position of straddling both the positivism[1] and anti-positivism[2] camps[3], this research also follows Kuhn’s recommendations on theory choice, described in his seminal work The Structure of Scientific Revolutions (Kuhn, 1970). According to Kuhn, a good theory should have five characteristics: 1) Accuracy – it should be empirically adequate, agreeing with observations and results; 2) Consistency – it should be internally consistent and consistent with other accepted theories; 3) Scope – it should explain beyond what it was designed for; 4) Simplicity – by the law of parsimony[4], the simpler of two competing theories is preferred; and 5) Fruitfulness – it should disclose new phenomena for research.

Although Kuhn’s work has been famously criticised by Popper, who promoted empirical falsification over inductivism, the author maintains that both verificationists and falsificationists are essential to the growth of knowledge (consistent with the position adopted earlier that knowledge is cumulative). The accumulation of knowledge can only happen when new insights are added to the stock of knowledge and false hypotheses are eliminated (Burrell & Morgan, 1979).

Finally, as a realist, the author maintains that all beliefs are an approximation of reality and every new observation brings us closer to understanding reality (Blackburn, 2005).


[1] Positivism is generally a form of deductive research commonly characterised by the use of (scientific) statistical techniques.

[2] Anti-positivism is the view that the social sciences should develop their own (non-scientific) methods, distinct from those used in the hard sciences.

[3] Kuhn’s work has been accused by the likes of Karl Popper of blurring the distinction between scientific and non-scientific methods, which ironically fits this research’s intermediate position of embracing both positivism and anti-positivism.

[4] Occam’s razor is a principle that states that, between two competing theories that make exactly the same predictions, the simpler one is preferred.

 


The Use of Social Media in Academia (or the lack of it)

November 5, 2009

Researchers have a lot to learn from Web 2.0-enabled businesses

As an early adopter of technology, I am usually among the first users of new products and platforms such as Facebook, Twitter, Skype, Gmail, ICQ (yes, I know) and IRC (don’t even get me started). However, I never really appreciated the power and reach of social media; I even deactivated my Facebook account at one point, but that is another story. I recently gave a keynote speech at the Outdoor Advertising Forum at the SIM Expo in Abu Dhabi. The Social Media Forum, a parallel track, took place one day earlier, and I had the pleasure of listening to some heavy hitters in the social media world, among them Andrew Bleeker (New Media Director for President Obama’s Inaugural Committee and Director of Internet Advertising for Obama for America, USA), the Managing Director of MySpace Germany, and Google and Yahoo! executives. It was enlightening. Needless to say, I now see the benefits of social media in a totally new light and from multiple perspectives (and I have reactivated my Facebook account).

Many of us take modern technology for granted, little realising just how much we depend on it. Yet we do depend on it. While we may not all be eternally checking Facebook for the latest photo or tweeting about what we are wearing, we are certainly all able to take advantage of Web 2.0 technologies. Checking the Internet for cheap holidays or car insurance, or even ordering our shopping online, has been wholly embraced by modern society; many of us could not fathom returning to a time without it.

Yet despite this, as David Stuart explores in a ‘Research Information’ article,[1] not all areas of society seem so willing to accept Web 2.0. Scholarly publishing, in particular, seems reluctant to use it to its full potential, despite the fact that it offers established researchers an unparalleled way to show the world their ideas and creations, and gives younger upstarts access to an almost infinite wealth of information. The open-data highway allows the full study process to be seen globally, and other researchers can comment on the results. Yet this group has almost entirely overlooked Web 2.0, except for the more adventurous few who provide largely ignored blogs and basic websites giving an overview of a study and its researchers. This is most odd, especially when one considers that videos can be uploaded, daily blogs – and vlogs – can be hosted online, and a large following of students and the public could potentially form.

It is incredibly perplexing, especially considering that the Internet has developed to the point of being useful to everyone. There now exist academic websites backed by large scientific publishers, such as citeulike.org (sponsored by Springer) and connotea.org (part of the Nature Publishing Group). For researchers who do not want such publisher backing, there are academic social-networking sites like academia.edu and myexperiment.org, which allow academics to share their research methodology and results with other users. And, of course, the large medical journals like the British Medical Journal are online. Such journals also allow fellow scientists to comment on the work, explaining its strengths and weaknesses and forming a network of scientific contacts. So the issue is clearly not that there is nothing on offer to academics.

Despite this potential, and the existing webpages and journal sites, the response from academics has been largely underwhelming. Web traffic data from alexa.com shows that, as yet, no academia-based website is universally popular. Perhaps predictably, the most popular ones are those with scientific-publisher backing, because they benefit both teams and solitary researchers. Even then, their relative popularity does not mean they are widely used; they are not ‘big’ websites at all.

This is perhaps not as surprising as we may initially think, though, and it is certainly not an accidental result. Nature Publishing Group ran a voluntary open peer-review trial and the vast majority of authors opted out; those who did take part received very few comments. Furthermore, institutional repositories have admitted that persuading academics to submit their research papers is incredibly difficult. It is as though academics want to exist in an enclosed world, researching for their own purposes and not wanting others to see the results. It seems that while all academics want to benefit from open access, they do not want their own work included in the databases.

It is important to realise that not all academics are so reluctant to embrace technology. Indeed, many are staunch advocates and see it as a great tool. But their numbers are few, and the rejecters are the clear majority. Perhaps one reason is simply that the traditional research paper is still heavily emphasised; it is the holy grail of academia. Maybe it is this preconceived notion of what it is to be a serious researcher that causes hesitation in the field: the idea that embracing technology would go against the long-held tradition of writing research papers and submitting them to journals. And while the main argument will be that this has worked well in the past, it overlooks how times have moved on and just how wide an audience researchers could reach if they embraced the Web 2.0 revolution.

But maybe the root cause is simple fear: scholars worry that they will become victims of plagiarism, perhaps believing that with untold millions of people using the Internet at any given moment they are at extra risk, and they still regard books as more ‘academic’ than the Internet. With this in mind, the Research Excellence Framework (REF) should place more emphasis on webometrics, in conjunction with bibliometrics. This may provide a much-needed push for academics to embrace the technology without leaving current methods behind. It will be a long process, though, and the largest change in attitudes will come when the next generation, raised on social networking, drags the academic world into the 21st century.


[1] http://www.researchinformation.info/features/feature.php?feature_id=236

 
