• MAY 16, 2024

What Is Empirical Research? Definition, Types & Samples in 2024


by Imed Bouchrika, PhD

Co-Founder and Chief Data Scientist

How was the world formed? Are there parallel universes? Why does time move forward but never in reverse? These longstanding questions have yet to receive definitive answers.

In research, these are called empirical questions: questions about how the world is and how it works. Such questions are addressed by a corresponding type of study, called empirical research or the empirical method, which is concerned with actual events and phenomena.

What is an empirical study? Research is empirical if it seeks a general explanation, one that applies across cases and over time. The empirical approach functions to create new knowledge about the way the world actually works. This article discusses the definition of empirical research, along with its concepts, types, processes, and other important aspects. It also tackles the importance of identifying evidence in research.

I. What is Empirical Research?

A. Definitions

What is empirical research? Empirical research is defined as any study whose conclusions are derived exclusively from concrete, verifiable evidence. The term empirical means guided by scientific experimentation and/or evidence. Likewise, a study is empirical when it uses real-world evidence to investigate its assertions.

This research type is founded on the view that direct observation of phenomena is a proper way to measure reality and generate truth about the world (Bhattacharya, 2008). As its name suggests, it is a research methodology that observes the rules of empiricism and uses quantitative and qualitative methods to gather evidence.

For instance, suppose a study is conducted to determine whether working from home reduces stress in highly demanding jobs. An experiment is run using two groups of employees, one working at home and the other working at the office, and each group is observed. The outcomes of this research provide empirical evidence on whether working from home does help reduce stress.

The term empirical originated with the ancient Greek medical practitioners (from empeirikos, meaning "experienced") when they began to deviate from long-observed dogmatic principles and depend instead on observed phenomena. Later on, empiricism came to denote a theory of knowledge in philosophy holding that knowledge comes from evidence and experience, derived particularly through the senses.

For the ancient philosophers, empirical research meant relying on observable data to design and test theories and reach conclusions; empirical research is thus used to produce knowledge based on experience. At present, the word "empirical" refers to the gathering of data through evidence derived from experience or observation, or by using calibrated scientific tools.

Most of today’s outstanding empirical research outputs are published in prestigious journals. These scientific publications are considered high-impact journals because they publish research articles that tend to be the most cited in their fields.

II. Types and Methodologies of Empirical Research

Empirical research is done using either qualitative or quantitative methods.

Qualitative research. Qualitative research methods are used to gather non-numerical data. They help determine the underlying reasons, views, or meanings held by study participants or subjects. Under the qualitative research design, empirical studies have evolved to test the conventional concepts of evidence and truth while still observing the fundamental principle of recognizing the subjects being studied as empirical (Powner, 2015).

This method can be semi-structured or unstructured. Results from this research type are more descriptive than predictive. It allows the researcher to draw conclusions in support of the hypothesis or theory being examined.

Due to practical constraints such as time and resources, the sample size in qualitative research is typically small. It is designed to offer in-depth information and greater insight into the problem. Some of the most popular qualitative methods are interviews, case studies, and focus groups.

Quantitative research. Quantitative research methods are used to gather information via numerical data. This type is used to measure behavior, personal views, preferences, and other variables. Quantitative studies follow a more structured format, and the variables used are predetermined.

Data gathered from quantitative studies is analyzed to address the empirical questions. Some of the commonly used quantitative methods are polls, surveys, and longitudinal or cohort studies.

There are situations when a single research method is not enough to adequately answer the questions being studied. In such cases, a combination of both qualitative and quantitative methods is necessary. Papers can also make use of both primary and secondary research methods.


III. Qualitative Empirical Research Methods

Some research questions need to be answered qualitatively and others quantitatively, depending on the nature of the study. These answers not only address empirical questions but also outline one's scope of work. Here are the general types of qualitative research methods.

Observational Method

This involves observing and gathering data from study subjects. As a qualitative approach, observation is quite personal and time-intensive. It is often used in ethnographic studies to obtain empirical evidence.

The observational method is part of the ethnographic research design, alongside approaches such as archival research and surveys. However, while it is commonly used for qualitative purposes, observation is also utilized in quantitative research, such as when measuring variables like weight and age.

One remarkable observational study was conducted by Abbott et al. (2016), a team of physicists from the Advanced Laser Interferometer Gravitational-Wave Observatory, who reported the very first direct observation of gravitational waves. According to Google Scholar's (2019) Metrics ranking, this study is among the most highly cited articles from the world's most influential journals (Crew, 2019).

Interview Method

This method is exclusively qualitative and is one of the most widely used (Jamshed, 2014). Its popularity is mainly due to its ability to yield precise, relevant information when the right questions are asked.

This method is a conversational approach through which in-depth data can be obtained. Interviews are commonly used in the social sciences and humanities, such as when interviewing resource persons.

Case Study Method

This method is used to derive extensive information through in-depth analysis of existing cases. It is typically used to obtain empirical evidence for investigating problems or business studies.

When conducting case studies, the researcher must carefully perform the empirical analysis, ensuring the variables and parameters in the current case are similar to the case being examined. From the findings of a case study, conclusions can be deduced about the topic being investigated.

Case studies are commonly used in studying the experience of organizations, groups of persons, geographic locations, etc.

Textual Analysis

This primarily involves the process of describing, interpreting, and understanding textual content. It typically seeks to connect the text to a broader artistic, cultural, political, or social context (Fairclough, 2003).

A relatively new research method, textual analysis is often used nowadays to describe trends and patterns in media content, especially on social media. Data obtained from this approach are primarily used to determine customer buying habits and preferences, informing product development and marketing campaigns.

Focus Groups

A focus group is a thoroughly planned discussion guided by a moderator and conducted to derive opinions on a designated topic. Essentially a group interview or collective conversation, this method offers a notably meaningful approach to think through particular issues or concerns (Kamberelis & Dimitriadis, 2011).

This research method is used when a researcher wants to know the answers to “how," “what," and “why" questions. Nowadays, focus groups are among the most widely used methods by consumer product producers for designing and/or improving products that people prefer.

IV. Quantitative Empirical Research Methods

Quantitative methods primarily help researchers to better analyze the gathered evidence. Here are the most common types of quantitative research techniques:

Experimental research

A research hypothesis is commonly tested using an experiment, which involves creating a controlled environment where the variables are manipulated. Aside from determining cause and effect, this method helps reveal testing outcomes, such as what happens when variables are altered or removed.

Traditionally, experimental, laboratory-based research is used to advance knowledge in the physical and life sciences, including psychology. In recent decades, more and more social scientists are also adopting lab experiments (Falk & Heckman, 2009).
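To make this concrete, here is a minimal sketch in Python of how the data from such a two-group experiment might be analyzed. The stress scores below are hypothetical, invented purely for illustration, and the analysis shown (an independent two-sample t-test) is just one common choice of statistical treatment.

```python
# A minimal sketch, with hypothetical data, of analyzing a two-group
# experiment such as the work-from-home study described earlier.
from scipy import stats

# Hypothetical self-reported stress scores (lower = less stress)
home_group = [42, 38, 35, 40, 37, 33, 39, 36]
office_group = [48, 45, 50, 44, 47, 43, 49, 46]

# Independent two-sample t-test: is the difference in group means
# larger than chance variation would suggest?
t_stat, p_value = stats.ttest_ind(home_group, office_group)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected in this sample.")
```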

Survey research

Survey research is designed to generate statistical data about a target audience (Fowler, 2014). Surveys can involve large, medium, or small populations and can be either a one-time event or a continuing process.

Governments across the world are among the heaviest users of continuing surveys, such as population censuses or labor force surveys. This is a quantitative method that uses predetermined sets of closed questions that are easy to answer, thus enabling the gathering and analysis of large data sets.

In the past, surveys were expensive and time-consuming. With advances in technology, however, new survey channels such as social media and email have made this research method easier and cheaper.

Causal-Comparative research

This method leverages the strength of comparison. It is primarily utilized to determine the cause and effect relationship among variables (Schenker & Rumrill, 2004).

For instance, a causal-comparative study might measure the productivity of employees in an organization that allows a remote work setup and compare it to the staff of another organization that does not offer work-from-home arrangements.

Cross-sectional research

While the observational method considers study subjects at a given point in time, cross-sectional research examines a set of subjects chosen to be similar in all variables except the one being studied.

This type does not allow for the determination of cause-effect relationships, since subjects are not observed continuously. A cross-sectional study is therefore often followed by longitudinal research to determine the precise causes. It is used mainly by pharmaceutical firms and retailers.

Longitudinal study

A longitudinal method of research is used for understanding the traits or behavior of a subject under observation after repeatedly testing the subject over a certain period of time. Data collected using this method can be qualitative or quantitative in nature. 

A commonly used form of longitudinal research is the cohort study. For instance, the British Doctors Study (Doll et al., 2004), a cohort study initiated in 1951, compared smokers and non-smokers in the UK and continued through 2001. As early as 1956, the study provided strong evidence of the direct link between smoking and the incidence of lung cancer.

Correlational research

This method is used to determine the relationships and prevalence among variables (Curtis et al., 2016). It commonly employs regression as the statistical treatment for predicting the study's outcomes, which can show a negative, neutral, or positive correlation.

A classic example of correlational research is a study of whether higher education helps in obtaining better-paying jobs. If outcomes indicate that higher education does allow individuals to land high-salaried jobs, it would also follow that people with less education tend to hold lower-paying jobs.
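As an illustration, here is a minimal Python sketch of such a correlational analysis. The education and salary figures are hypothetical, invented for illustration; regression is used because, as noted above, it is the usual statistical treatment for predicting outcomes.

```python
# A minimal sketch of a correlational analysis on hypothetical data.
from scipy import stats

years_of_education = [10, 12, 12, 14, 16, 16, 18, 20]
annual_salary_usd = [28000, 33000, 31000, 40000, 52000, 49000, 60000, 72000]

# Simple linear regression; rvalue is the Pearson correlation.
result = stats.linregress(years_of_education, annual_salary_usd)

print(f"correlation r = {result.rvalue:.2f}")            # strength and direction
print(f"slope = {result.slope:.0f} USD per extra year")  # predicted change
print(f"p-value = {result.pvalue:.4f}")

# Caveat: a strong correlation alone does not establish causation.
```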


V. Steps for Conducting Empirical Research

Since empirical research is based on observation and capturing experiences, it is important to plan the steps of the experiment and how it will be analyzed. This enables the researcher to resolve problems or obstacles that can occur during the experiment.

Step #1: Establishing the research objective

In this initial step, the researcher must be clear about what he or she precisely wants to accomplish in the study. He or she should likewise frame the problem statement and plans of action, and determine any potential issues with the available resources, schedule, and other constraints of the research.

Most importantly, the researcher must be able to ascertain whether the study will be more beneficial than the cost it will incur.

Step #2: Reviewing relevant literature and supporting theories

The researcher must identify theories or models relevant to the research problem. If such theories or models exist, he or she must understand how they can help support the study outcomes.

Relevant literature must also be consulted. The researcher must be able to identify previous studies that examined similar problems or subjects, as well as determine the issues encountered.

Step #3: Framing the hypothesis and measurement

The researcher must frame an initial hypothesis, an educated guess about the likely outcome. Variables must be established, along with the research context.

Units of measurement should also be defined, including the allowable margin of error. The researcher must determine whether the selected measures will be accepted by other scholars.

Step #4: Defining the research design, methodology, and data collection techniques

Before proceeding with the study, the researcher must establish an appropriate approach for the research. He or she must organize experiments to gather data that will allow the hypothesis to be tested.

The researcher should also decide whether to use a nonexperimental or experimental technique to perform the study. Likewise, the type of research design will depend on the type of study being conducted.

Finally, the researcher must determine the parameters that will influence the validity of the research design. Data gathering must be performed by selecting suitable samples based on the research question. After gathering the empirical data, the analysis follows.

Step #5: Conducting data analysis and framing the results

Data analysis is done either quantitatively or qualitatively. Depending on the nature of the study, the researcher must determine which method of data analysis is the appropriate one, or whether a combination of the two is suitable.

The outcomes of this step determine whether the hypothesis is supported or rejected. This is why data analysis is considered one of the most crucial steps in any research undertaking.

Step #6: Making conclusions

A report must be prepared that presents the findings and the entire research process. If the researcher intends to disseminate the findings to a wider audience, the report can be converted into an article for publication. Aside from the typical parts, from the introduction and literature review to the methods, analysis, and conclusions, the researcher should also make recommendations for further research on the topic.

To ensure the originality and credibility of the report, it is also advisable to run the manuscript through a reliable plagiarism checker. Doing so lets the researcher verify the uniqueness of the work, avoid unintentional instances of plagiarism, and maintain the integrity of the research. Educators can likewise check the originality of their students' research by utilizing a free plagiarism checker for teachers.

VI. Empirical Research Cycle

The empirical research cycle is composed of five phases, each considered as important as the next (de Groot, 1969). This rigorous and systematic method captures the process of framing hypotheses about how certain subjects behave or function and then testing them against empirical data. It is considered to typify the deductive approach to science.

These are the five phases of the empirical research cycle:

1. Observation

During this initial phase, an idea is triggered for proposing a hypothesis. It involves the use of observation to gather empirical data. For example:

Consumers tend to consult their smartphones first before buying something in-store.

2. Induction

Inductive reasoning is then conducted to frame a general conclusion from the data gathered through observation. For example:

As observed earlier, most consumers tend to consult their smartphones first before buying something in-store.

A researcher may pose the question, “Does the tendency to use a smartphone indicate that today’s consumers need to be informed before making purchasing decisions?" The researcher can assume that is the case. Nonetheless, since it is still just a supposition, an experiment must be conducted to support or reject this hypothesis.

The researcher then decides to conduct an online survey about the buying habits of a sample population of buyers at brick-and-mortar stores. This will determine whether people always look at their smartphones first before making a purchase.

3. Deduction

This phase enables the researcher to draw a conclusion from the experiment. The reasoning must be based on rationality and logic in order to arrive at specific, unbiased outcomes. For example:

In the experiment, if a shopper consults his or her smartphone first before buying in-store, then it can be concluded that the shopper needs information to help him or her make informed buying decisions.

4. Testing

This phase involves the researcher going back to the empirical research steps to test the hypothesis. The data must now be analyzed and validated using appropriate statistical methods.

If the researcher confirms that in-store shoppers do consult their smartphones for product information before making a purchase, the researcher has found support for the hypothesis. However, it should be noted that this is just support of the hypothesis, not proof of a reality.

5. Evaluation

This phase is often neglected but is actually a crucial step for continually expanding knowledge. During this stage, the researcher presents the gathered data, the supporting contentions, and the conclusions.

The researcher likewise puts forth the limitations of the study and the hypothesis, and makes recommendations for further studies on the same topic with expanded variables.


VII. Advantages and Disadvantages of Empirical Research

Advantages

Since the time of the ancient Greeks, empirical research has been providing the world with numerous benefits. The following are a few of them:

  • Empirical research is used to validate previous research findings and frameworks.
  • It assumes a critical role in enhancing internal validity.
  • The degree of control is high, which enables the researcher to manage numerous variables.
  • It allows a researcher to comprehend the progressive changes that can occur, and thus enables him or her to modify an approach when needed.
  • Being based on facts and experience makes a research project more authentic and competent.

Disadvantages

Despite the many benefits it brings, empirical research is far from perfect. The following are some of its drawbacks:

  • Being evidence-based, data collection is a common problem especially when the research involves different sources and multiple methods.
  • It can be time-consuming, especially for longitudinal research.
  • Requesting permission to perform certain methods can be difficult, especially when a study involves human subjects.
  • Conducting research in multiple locations can be very expensive.
  • Even seasoned researchers have a propensity to misinterpret statistical significance. For instance, Amrhein et al. (2019) analyzed 791 articles from five journals and found that half incorrectly interpreted non-significance as indicating zero effect (see the sketch below).
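The sketch below, using simulated (hypothetical) data in Python, illustrates the misreading Amrhein et al. (2019) warn about: even when a real effect exists, a small sample can easily yield a non-significant p-value, which does not demonstrate that the effect is zero.

```python
# A minimal sketch with simulated, hypothetical data: a genuine effect
# exists, yet the small sample often fails to reach p < 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# Group B's true mean really is 0.5 units higher than group A's.
group_a = rng.normal(loc=0.0, scale=1.0, size=15)
group_b = rng.normal(loc=0.5, scale=1.0, size=15)

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"p = {p_value:.3f}")  # frequently > 0.05 at this sample size

# Correct reading: the data are compatible with no effect;
# they do not prove that the effect is zero.
```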

VIII. Samples of Empirical Research

There are many types of empirical research, and they can take many forms, from basic research to action research like community project efforts. Here are some notable empirical research examples:

Professional Research

  • Research on Information Technology
  • Research on Infectious Diseases
  • Research on Occupational Health Psychology
  • Research on Infection Control
  • Research on Cancer
  • Research on Mathematical Science
  • Research on Environmental Science
  • Research on Genetics
  • Research on Climate Change
  • Research on Economics

Student Research

  • Thesis for B.S. in Computer Science & Engineering  
  • Thesis for B.S. in Geography
  • Thesis for B.S. in Architecture
  • Thesis for Master of Science in Electrical Engineering & Computer Science
  • Thesis for Master of Science in Artificial Intelligence
  • Thesis for Master of Science in Food Science and Nutrition
  • Dissertation for Ph.D. in Marketing  
  • Dissertation for Ph.D. in Social Work
  • Dissertation for Ph.D. in Urban Planning

From ancient times until today, empirical research has remained one of the most useful tools in humanity's collective endeavor to unlock life's mysteries. Using meaningful experience and observable evidence, this type of research will continue to help validate myriad hypotheses, test theoretical models, and advance various fields of specialization.

With new forms of deadly diseases and other problems continuing to threaten human existence, finding effective medical interventions and relevant solutions has never been more important. This is among the reasons why empirical research has assumed a more prominent role in today's society.

This article discussed the different empirical research methods, the steps for conducting empirical research, the empirical research cycle, and notable examples. All of these support the larger societal cause of understanding how the world really works and making it a better place. Furthermore, factual accuracy is a big part of the criteria of good research, and it serves as the heart of empirical research.

Key Insights

  • Definition of Empirical Research: Empirical research is based on verifiable evidence derived from observation and experimentation, aiming to understand how the world works.
  • Origins: The concept of empirical research dates back to ancient Greek medical practitioners who relied on observed phenomena rather than dogmatic principles.
  • Types and Methods: Empirical research can be qualitative (e.g., interviews, case studies) or quantitative (e.g., surveys, experiments), depending on the nature of the data collected and the research question.
  • Empirical Research Cycle: Consists of observation, induction, deduction, testing, and evaluation, forming a systematic approach to generating and testing hypotheses.
  • Steps in Conducting Empirical Research: Includes establishing objectives, reviewing literature, framing hypotheses, designing methodology, collecting data, analyzing data, and making conclusions.
  • Advantages: Empirical research validates previous findings, enhances internal validity, allows for high control over variables, and is fact-based, making it authentic and competent.
  • Disadvantages: Data collection can be challenging and time-consuming, especially in longitudinal studies, and interpreting statistical significance can be problematic.
  • Applications: Used across various fields such as IT, infectious diseases, occupational health, environmental science, and economics. It is also prevalent in student research for theses and dissertations.
Frequently Asked Questions

  • What is the primary goal of empirical research? The primary goal of empirical research is to generate knowledge about how the world works by relying on verifiable evidence obtained through observation and experimentation.
  • How does empirical research differ from theoretical research? Empirical research is based on observable and measurable evidence, while theoretical research involves abstract ideas and concepts without necessarily relying on real-world data.
  • What are the main types of empirical research methods? The main types of empirical research methods are qualitative (e.g., interviews, case studies, focus groups) and quantitative (e.g., surveys, experiments, cross-sectional studies).
  • Why is the empirical research cycle important? The empirical research cycle is important because it provides a structured and systematic approach to generating and testing hypotheses, ensuring that the research is thorough and reliable.
  • What are the steps involved in conducting empirical research? The steps involved in conducting empirical research include establishing the research objective, reviewing relevant literature, framing hypotheses, defining research design and methodology, collecting data, analyzing data, and making conclusions.
  • What are the advantages of empirical research? The advantages of empirical research include validating previous findings, enhancing internal validity, allowing for high control over variables, and being based on facts and experiences, making the research authentic and competent.
  • What are some common challenges in conducting empirical research? Common challenges in conducting empirical research include difficulties in data collection, time-consuming processes, obtaining permissions for certain methods, high costs, and potential misinterpretation of statistical significance.
  • In which fields is empirical research commonly used? Empirical research is commonly used in fields such as information technology, infectious diseases, occupational health, environmental science, economics, and various academic disciplines for student theses and dissertations.
  • Can empirical research use both qualitative and quantitative methods? Yes, empirical research can use both qualitative and quantitative methods, often combining them to provide a comprehensive understanding of the research problem.
  • What role does empirical research play in modern society? Empirical research plays a crucial role in modern society by validating hypotheses, testing theoretical models, and advancing knowledge across various fields, ultimately contributing to solving complex problems and improving the quality of life.
References

  • Abbott, B., Abbott, R., Abbott, T., Abernathy, M., & Acernese, F. (2016). Observation of Gravitational Waves from a Binary Black Hole Merger. Physical Review Letters, 116 (6), 061102. https://doi.org/10.1103/PhysRevLett.116.061102
  • Akpinar, E. (2014). Consumer Information Sharing: Understanding Psychological Drivers of Social Transmission . (Unpublished Ph.D. dissertation). Erasmus University Rotterdam, Rotterdam, The Netherlands.  http://hdl.handle.net/1765/1
  • Altmetric (2020). The 2019 Altmetric top 100. Altmetric .
  • Amrhein, V., Greenland, S., & McShane, B. (2019). Scientists rise up against statistical significance. Nature, 567 , 305-307.  https://doi.org/10.1038/d41586-019-00857-9
  • Amrhein, V., Trafimow, D., & Greenland, S. (2019). Inferential statistics as descriptive statistics: There is no replication crisis if we don’t expect replication. The American Statistician, 73 , 262-270. https://doi.org/10.1080/00031305.2018.1543137
  • Arute, F., Arya, K., Babbush, R. et al. (2019). Quantum supremacy using a programmable superconducting processor. Nature, 574, 505-510. https://doi.org/10.1038/s41586-019-1666-5
  • Bhattacharya, H. (2008). Empirical Research. In L. M. Given (ed.), The SAGE Encyclopedia of Qualitative Research Methods . Thousand Oaks, CA: Sage, 254-255.  https://dx.doi.org/10.4135/9781412963909.n133
  • Cohn, A., Maréchal, M., Tannenbaum, D., & Zund, C. (2019). Civic honesty around the globe. Science, 365 (6448), 70-73. https://doi.org/10.1126/science.aau8712
  • Corbin, J., & Strauss, A. (2015). Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, 4th ed . Thousand Oaks, CA: Sage. ISBN 978-1-4129-9746-1
  • Crew, B. (2019, August 2). Google Scholar reveals its most influential papers for 2019. Nature Index .
  • Curtis, E., Comiskey, C., & Dempsey, O. (2016). Importance and use of correlational research. Nurse Researcher, 23 (6), 20-25. https://doi.org/10.7748/nr.2016.e1382
  • Dashti, H., Jones, S., Wood, A., Lane, J., & van Hees, V., et al. (2019). Genome-wide association study identifies genetic loci for self-reported habitual sleep duration supported by accelerometer-derived estimates. Nature Communications, 10 (1).  https://doi.org/10.1038/s41467-019-08917-4
  • de Groot, A.D. (1969). Methodology: foundations of inference and research in the behavioral sciences. In  Psychological Studies, 6 . The Hague & Paris: Mouton & Co. Google Books
  • Doll, R., Peto, R., Boreham, J., & Sutherland, I. (2004). Mortality in relation to smoking: 50 years’ observations on male British doctors. BMJ, 328 (7455), 1519-33. https://doi.org/10.1136/bmj.38142.554479.AE
  • Fairclough, N. (2003). Analyzing Discourse: Textual Analysis for Social Research . Abingdon-on-Thames: Routledge. Google Books
  • Falk, A., & Heckman, J. (2009). Lab experiments are a major source of knowledge in the social sciences. Science, 326 (5952), pp. 535-538. https://doi.org/10.1126/science.1168244
  • Fowler, F.J. (2014). Survey Research Methods, 5th ed . Thousand Oaks, CA: Sage. WorldCat
  • Gabriel, A., Manalo, M., Feliciano, R., Garcia, N., Dollete, U., & Paler J. (2018). A Candida parapsilosis inactivation-based UV-C process for calamansi (Citrus microcarpa) juice drink. LWT Food Science and Technology, 90, 157-163. https://doi.org/10.1016/j.lwt.2017.12.020
  • Gallus, S., Bosetti, C., Negri, E., Talamini, R., Montella, M., et al. (2003). Does pizza protect against cancer? International Journal of Cancer, 107 (2), pp. 283-284. https://doi.org/10.1002/ijc.11382
  • Ganna, A., Verweij, K., Nivard, M., Maier, R., & Wedow, R. (2019). Large-scale GWAS reveals insights into the genetic architecture of same-sex sexual behavior. Science, 365 (6456). https://doi.org/10.1126/science.aat7693
  • Gedik, H., Voss, T., & Voss, A. (2013). Money and Transmission of Bacteria. Antimicrobial Resistance and Infection Control, 2 (2).  https://doi.org/10.1186/2047-2994-2-22
  • Gonzalez-Morales, M. G., Kernan, M. C., Becker, T. E., & Eisenberger, R. (2018). Defeating abusive supervision: Training supervisors to support subordinates. Journal of Occupational Health Psychology, 23  (2), 151-162. https://dx.doi.org/10.1037/ocp0000061
  • Google (2020). The 2019 Google Scholar Metrics Ranking . Google Scholar
  • Greenberg, D., Warrier, V., Allison, C., & Baron-Cohen, S. (2018). Testing the Empathizing-Systemising theory of sex differences and the Extreme Male Brain theory of autism in half a million people. PNAS, 115 (48), 12152-12157. https://doi.org/10.1073/pnas.1811032115
  • Grullon, D. (2019). Disentangling time constant and time-dependent hidden state in time series with variational Bayesian inference . (Unpublished master’s thesis). Massachusetts Institute of Technology, Cambridge, MA.  https://hdl.handle.net/1721.1/124572
  • He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) , 770-778. https://doi.org/10.1109/CVPR.2016.90
  • Hviid, A., Hansen, J., Frisch, M., & Melbye, M. (2019). Measles, mumps, rubella vaccination, and autism: A nationwide cohort study. Annals of Internal Medicine, 170 (8), 513-520. https://doi.org/10.7326/M18-2101
  • Jamshed, S. (2014). Qualitative research method-interviewing and observation. Journal of Basic and Clinical Pharmacy, 5 (4), 87-88. https://doi.org/10.4103/0976-0105.141942
  • Jamshidnejad, A. (2017). Efficient Predictive Model-Based and Fuzzy Control for Green Urban Mobility . (Unpublished Ph.D. dissertation). Delft University of Technology, Delft, Netherlands.  DUT
  • Kamberelis, G., & Dimitriadis, G. (2011). Focus groups: Contingent articulations of pedagogy, politics, and inquiry. In N. Denzin & Y. Lincoln (Eds.), The SAGE Handbook of Qualitative Research  (pp. 545-562). Thousand Oaks, CA: Sage. ISBN 978-1-4129-7417-2
  • Knowles-Smith, A. (2017). Refugees and theatre: an exploration of the basis of self-representation . (Unpublished undergraduate thesis). University College London, London, UK. UCL
  • Kulp, S.A., & Strauss, B.H. (2019). New elevation data triple estimates of global vulnerability to sea-level rise and coastal flooding. Nature Communications, 10 (4844), 1-12.  https://doi.org/10.1038/s41467-019-12808-z
  • LeCun, Y., Bengio, Y. & Hinton, G. (2015). Deep learning. Nature, 521, 436-444. https://doi.org/10.1038/nature14539
  • Levitt, H. M., Bamberg, M., Creswell, J. W., Frost, D. M., Josselson, R., & Suarez-Orozco, C. (2018). Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report.  American Psychologist, 73 (1), 26-46. https://doi.org/10.1037/amp0000151
  • Long, J., Shelhamer, E., & Darrell, T. (2015). Fully convolutional networks for semantic segmentation. 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) , 3431-3440. https://doi.org/10.1109/CVPR.2015.7298965
  • Martindell, N. (2014). DCDN: Distributed content delivery for the modern web . (Unpublished undergraduate thesis). University of Washington, Seattle, WA. CSE-UW
  • Mora, T. (2019). Transforming Parking Garages Into Affordable Housing . (Unpublished undergraduate thesis). University of Arkansas-Fayetteville, Fayetteville, AK. UARK
  • Ng, M., Fleming, T., Robinson, M., Thomson, B., & Graetz, N. (2014). Global, regional, and national prevalence of overweight and obesity in children and adults during 1980-2013: a systematic analysis for the Global Burden of Disease Study 2013. The Lancet, 384 (9945), 766-781. https://doi.org/10.1016/S0140-6736(14)60460-8
  • Ogden, C., Carroll, M., Kit, B., & Flegal, K. (2014). Prevalence of Childhood and Adult Obesity in the United States, 2011-2012. JAMA, 311 (8), 806-14. https://doi.org/10.1001/jama.2014.732
  • Powner, L. (2015). Empirical Research and Writing: A Political Science Student’s Practical Guide . Thousand Oaks, CA: Sage, 1-19.  https://dx.doi.org/10.4135/9781483395906
  • Ripple, W., Wolf, C., Newsome, T., Barnard, P., & Moomaw, W. (2020). World scientists’ warning of a climate emergency. BioScience, 70 (1), 8-12. https://doi.org/10.1093/biosci/biz088
  • Schenker, J., & Rumrill, P. (2004). Causal-comparative research designs. Journal of Vocational Rehabilitation, 21 (3), 117-121.
  • Shereen, M., Khan, S., Kazmi, A., Bashir, N., & Siddique, R. (2020). COVID-19 infection: Origin, transmission, and characteristics of human coronaviruses. Journal of Advanced Research, 24 , 91-98.  https://doi.org/10.1016/j.jare.2020.03.005
  • Sipola, C. (2017). Summarizing electricity usage with a neural network . (Unpublished master’s thesis). University of Edinburgh, Edinburgh, Scotland. Project-Archive
  • Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., Erhan, D., Vanhoucke, V., & Rabinovich, A. (2015). Going deeper with convolutions. 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) , 1-9. https://doi.org/10.1109/CVPR.2015.7298594
  • Taylor, S. (2017). Effacing and Obscuring Autonomy: the Effects of Structural Violence on the Transition to Adulthood of Street Involved Youth . (Unpublished Ph.D. dissertation). University of Ottawa, Ottawa, Canada. UOttawa
  • Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359 (6380), 1146-1151. https://doi.org/10.1126/science.aap9559

Empirical Research: Definition, Methods, Types and Examples

What is Empirical Research

Content Index

Empirical research: Definition

Empirical research: origin, quantitative research methods, qualitative research methods, steps for conducting empirical research, empirical research methodology cycle, advantages of empirical research, disadvantages of empirical research, why is there a need for empirical research.

Empirical research is defined as any research where conclusions of the study is strictly drawn from concretely empirical evidence, and therefore “verifiable” evidence.

This empirical evidence can be gathered using quantitative market research and  qualitative market research  methods.

For example: A research is being conducted to find out if listening to happy music in the workplace while working may promote creativity? An experiment is conducted by using a music website survey on a set of audience who are exposed to happy music and another set who are not listening to music at all, and the subjects are then observed. The results derived from such a research will give empirical evidence if it does promote creativity or not.

LEARN ABOUT: Behavioral Research

You must have heard the quote” I will not believe it unless I see it”. This came from the ancient empiricists, a fundamental understanding that powered the emergence of medieval science during the renaissance period and laid the foundation of modern science, as we know it today. The word itself has its roots in greek. It is derived from the greek word empeirikos which means “experienced”.

In today’s world, the word empirical refers to collection of data using evidence that is collected through observation or experience or by using calibrated scientific instruments. All of the above origins have one thing in common which is dependence of observation and experiments to collect data and test them to come up with conclusions.

LEARN ABOUT: Causal Research

Types and methodologies of empirical research

Empirical research can be conducted and analysed using qualitative or quantitative methods.

  • Quantitative research : Quantitative research methods are used to gather information through numerical data. It is used to quantify opinions, behaviors or other defined variables . These are predetermined and are in a more structured format. Some of the commonly used methods are survey, longitudinal studies, polls, etc
  • Qualitative research:   Qualitative research methods are used to gather non numerical data.  It is used to find meanings, opinions, or the underlying reasons from its subjects. These methods are unstructured or semi structured. The sample size for such a research is usually small and it is a conversational type of method to provide more insight or in-depth information about the problem Some of the most popular forms of methods are focus groups, experiments, interviews, etc.

Data collected from these will need to be analysed. Empirical evidence can also be analysed either quantitatively and qualitatively. Using this, the researcher can answer empirical questions which have to be clearly defined and answerable with the findings he has got. The type of research design used will vary depending on the field in which it is going to be used. Many of them might choose to do a collective research involving quantitative and qualitative method to better answer questions which cannot be studied in a laboratory setting.

LEARN ABOUT: Qualitative Research Questions and Questionnaires

Quantitative research methods aid in analyzing the empirical evidence gathered. By using these a researcher can find out if his hypothesis is supported or not.

  • Survey research: Survey research generally involves a large audience to collect a large amount of data. This is a quantitative method having a predetermined set of closed questions which are pretty easy to answer. Because of the simplicity of such a method, high responses are achieved. It is one of the most commonly used methods for all kinds of research in today’s world.

Previously, surveys were taken face to face only with maybe a recorder. However, with advancement in technology and for ease, new mediums such as emails , or social media have emerged.

For example: Depletion of energy resources is a growing concern and hence there is a need for awareness about renewable energy. According to recent studies, fossil fuels still account for around 80% of energy consumption in the United States. Even though there is a rise in the use of green energy every year, there are certain parameters because of which the general population is still not opting for green energy. In order to understand why, a survey can be conducted to gather opinions of the general population about green energy and the factors that influence their choice of switching to renewable energy. Such a survey can help institutions or governing bodies to promote appropriate awareness and incentive schemes to push the use of greener energy.

Learn more: Renewable Energy Survey Template Descriptive Research vs Correlational Research

  • Experimental research: In experimental research , an experiment is set up and a hypothesis is tested by creating a situation in which one of the variable is manipulated. This is also used to check cause and effect. It is tested to see what happens to the independent variable if the other one is removed or altered. The process for such a method is usually proposing a hypothesis, experimenting on it, analyzing the findings and reporting the findings to understand if it supports the theory or not.

For example: A particular product company is trying to find what is the reason for them to not be able to capture the market. So the organisation makes changes in each one of the processes like manufacturing, marketing, sales and operations. Through the experiment they understand that sales training directly impacts the market coverage for their product. If the person is trained well, then the product will have better coverage.

  • Correlational research: Correlational research is used to find relation between two set of variables . Regression analysis is generally used to predict outcomes of such a method. It can be positive, negative or neutral correlation.

LEARN ABOUT: Level of Analysis

For example: Higher educated individuals will get higher paying jobs. This means higher education enables the individual to high paying job and less education will lead to lower paying jobs.

  • Longitudinal study: Longitudinal study is used to understand the traits or behavior of a subject under observation after repeatedly testing the subject over a period of time. Data collected from such a method can be qualitative or quantitative in nature.

For example: A research to find out benefits of exercise. The target is asked to exercise everyday for a particular period of time and the results show higher endurance, stamina, and muscle growth. This supports the fact that exercise benefits an individual body.

  • Cross sectional: Cross sectional study is an observational type of method, in which a set of audience is observed at a given point in time. In this type, the set of people are chosen in a fashion which depicts similarity in all the variables except the one which is being researched. This type does not enable the researcher to establish a cause and effect relationship as it is not observed for a continuous time period. It is majorly used by healthcare sector or the retail industry.

For example: A medical study to find the prevalence of under-nutrition disorders in kids of a given population. This will involve looking at a wide range of parameters like age, ethnicity, location, incomes  and social backgrounds. If a significant number of kids coming from poor families show under-nutrition disorders, the researcher can further investigate into it. Usually a cross sectional study is followed by a longitudinal study to find out the exact reason.

  • Causal-Comparative research : This method is based on comparison. It is mainly used to find out cause-effect relationship between two variables or even multiple variables.

For example: A researcher measured the productivity of employees in a company which gave breaks to the employees during work and compared that to the employees of the company which did not give breaks at all.

LEARN ABOUT: Action Research

Some research questions need to be analysed qualitatively, as quantitative methods are not applicable there. In many cases, in-depth information is needed or a researcher may need to observe a target audience behavior, hence the results needed are in a descriptive analysis form. Qualitative research results will be descriptive rather than predictive. It enables the researcher to build or support theories for future potential quantitative research. In such a situation qualitative research methods are used to derive a conclusion to support the theory or hypothesis being studied.

LEARN ABOUT: Qualitative Interview

  • Case study: Case study method is used to find more information through carefully analyzing existing cases. It is very often used for business research or to gather empirical evidence for investigation purpose. It is a method to investigate a problem within its real life context through existing cases. The researcher has to carefully analyse making sure the parameter and variables in the existing case are the same as to the case that is being investigated. Using the findings from the case study, conclusions can be drawn regarding the topic that is being studied.

For example: A report mentioning the solution provided by a company to its client. The challenges they faced during initiation and deployment, the findings of the case and solutions they offered for the problems. Such case studies are used by most companies as it forms an empirical evidence for the company to promote in order to get more business.

  • Observational method:   Observational method is a process to observe and gather data from its target. Since it is a qualitative method it is time consuming and very personal. It can be said that observational research method is a part of ethnographic research which is also used to gather empirical evidence. This is usually a qualitative form of research, however in some cases it can be quantitative as well depending on what is being studied.

For example: setting up a research to observe a particular animal in the rain-forests of amazon. Such a research usually take a lot of time as observation has to be done for a set amount of time to study patterns or behavior of the subject. Another example used widely nowadays is to observe people shopping in a mall to figure out buying behavior of consumers.

  • One-on-one interview: Such a method is purely qualitative and one of the most widely used. The reason being it enables a researcher get precise meaningful data if the right questions are asked. It is a conversational method where in-depth data can be gathered depending on where the conversation leads.

For example: A one-on-one interview with the finance minister to gather data on financial policies of the country and its implications on the public.

  • Focus groups: Focus groups are used when a researcher wants to find answers to why, what and how questions. A small group is generally chosen for such a method and it is not necessary to interact with the group in person. A moderator is generally needed in case the group is being addressed in person. This is widely used by product companies to collect data about their brands and the product.

For example: A mobile phone manufacturer wanting to have a feedback on the dimensions of one of their models which is yet to be launched. Such studies help the company meet the demand of the customer and position their model appropriately in the market.

  • Text analysis: Text analysis method is a little new compared to the other types. Such a method is used to analyse social life by going through images or words used by the individual. In today’s world, with social media playing a major part of everyone’s life, such a method enables the research to follow the pattern that relates to his study.

For example: A lot of companies ask for feedback from the customer in detail mentioning how satisfied are they with their customer support team. Such data enables the researcher to take appropriate decisions to make their support team better.

Sometimes a combination of the methods is also needed for some questions that cannot be answered using only one type of method especially when a researcher needs to gain a complete understanding of complex subject matter.

We recently published a blog that talks about examples of qualitative data in education ; why don’t you check it out for more ideas?

Learn More: Data Collection Methods: Types & Examples

Since empirical research is based on observation and capturing experiences, it is important to plan the steps to conduct the experiment and how to analyse it. This will enable the researcher to resolve problems or obstacles which can occur during the experiment.

Step #1: Define the purpose of the research

This is the step where the researcher must answer questions such as: What exactly do I want to find out? What is the problem statement? Are there any issues with the availability of knowledge, data, time, or resources? Will this research be more beneficial than it will cost?

Before going ahead, the researcher has to clearly define the purpose of the research and set up a plan to carry out the subsequent tasks.

Step #2: Supporting theories and relevant literature

The researcher needs to find out whether there are theories that can be linked to the research problem, and whether any theory can help support the findings. Relevant literature of all kinds helps the researcher discover whether others have researched the topic before and what problems they faced. The researcher will also have to set up assumptions and find out whether there is any history regarding the research problem.

Step #3: Creation of Hypothesis and measurement

Before beginning the actual research, the researcher needs a working hypothesis: an educated guess about the probable result. The researcher has to set up variables, decide the environment for the research, and work out how the variables relate to each other.

The researcher will also need to define the units of measurement and the tolerable degree of error, and determine whether the chosen measurements will be accepted by others.

Step #4: Methodology, research design and data collection

In this step, the researcher defines a strategy for conducting the research and sets up experiments to collect data that will make it possible to test the hypothesis. The researcher decides whether an experimental or non-experimental method is needed; the type of research design will vary depending on the field in which the research is being conducted. Last but not least, the researcher has to identify the parameters that will affect the validity of the research design. Data collection is done by choosing samples appropriate to the research question, using one of the many available sampling techniques. Once data collection is complete, the researcher has empirical data that needs to be analyzed.

LEARN ABOUT: Best Data Collection Tools

Step #5: Data Analysis and result

Data analysis can be done in two ways: qualitatively and quantitatively. The researcher needs to determine whether a qualitative method, a quantitative method, or a combination of both is required. Depending on the analysis of the data, the researcher will know whether the hypothesis is supported or rejected. Analyzing the data is the most important step in supporting the hypothesis.
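As a simple illustration of the quantitative route, the sketch below runs a two-sample t-test on two hypothetical groups; all numbers are invented, and a real study would substitute its own measurements and its own choice of test.

```python
# Minimal sketch of quantitative analysis for a two-group design.
# The scores below are hypothetical stand-ins for real measurements.
from scipy import stats

group_a = [3.1, 2.8, 3.4, 2.9, 3.6, 3.0]  # outcome scores, condition A
group_b = [3.9, 4.2, 3.7, 4.0, 4.4, 3.8]  # outcome scores, condition B

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (conventionally < 0.05) means the data are unlikely under
# the null hypothesis of equal group means, so the hypothesis of a group
# difference is supported (though never proven).
```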

Step #6: Conclusion

A report needs to be written up with the findings of the research. The researcher can present the theories and literature that support the research, and make suggestions or recommendations for further research on the topic.

Empirical research methodology cycle

A.D. de Groot, a famous Dutch psychologist and chess expert, conducted some of the most notable experiments using chess in the 1940s. During his work, he formulated a cycle that is now widely used to conduct empirical research. It consists of five phases, each as important as the next. The empirical cycle captures the process of proposing hypotheses about how certain subjects work or behave and then testing these hypotheses against empirical data in a systematic and rigorous way. It can be said to characterize the hypothetico-deductive approach to science. The empirical cycle is as follows.

  • Observation: In this phase, an idea is sparked for proposing a hypothesis, and empirical data is gathered through observation. For example: a particular species of flower blooms in a different color only during a specific season.
  • Induction: Inductive reasoning is then carried out to form a general conclusion from the data gathered through observation. For example: having observed that the species of flower blooms in a different color during a specific season, a researcher may ask, “Does the temperature in that season cause the color change in the flower?” The researcher can assume this is the case, but it is mere conjecture, so an experiment must be set up to test the hypothesis. The researcher therefore tags a set of flowers kept at a different temperature and observes whether they still change color.
  • Deduction: This phase helps the researcher deduce a conclusion from the experiment, based on logic and rationality, to arrive at specific, unbiased results. For example: if the tagged flowers in the different temperature environment do not change color, it can be concluded that temperature plays a role in changing the color of the bloom.
  • Testing: In this phase, the researcher returns to empirical methods to put the hypothesis to the test. The researcher now needs to make sense of the data, and hence uses a statistical analysis plan to determine the relationship between temperature and bloom color (a minimal sketch follows this list). If most flowers bloom in a different color when exposed to the certain temperature while the others do not, the researcher has found support for the hypothesis. Note that this is support for the hypothesis, not proof.
  • Evaluation: This phase is often forgotten but is an important one for continuing to gain knowledge. In it, the researcher puts forth the data collected, the supporting argument, and the conclusion. The researcher also states the limitations of the experiment and the hypothesis, and suggests how others can pick it up and carry out more in-depth research in the future.
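A minimal sketch of the statistical step in the testing phase, using the flower example; the counts are invented for illustration, and a real analysis would follow its pre-specified statistical analysis plan.

```python
# Hypothetical counts for the flower example: bloom color observed under
# two temperature conditions (all numbers invented for illustration).
from scipy.stats import chi2_contingency

#            changed color  usual color
observed = [[18, 2],   # flowers at the seasonal temperature
            [3, 17]]   # tagged flowers at a different temperature

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, p = {p:.4f}")
# A small p-value supports (but does not prove) the hypothesis that
# temperature and bloom color are related.
```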

LEARN MORE: Population vs Sample

There is a reason why empirical research is one of the most widely used methods: it has several advantages, a few of which follow.

  • It is used to authenticate traditional research through various experiments and observations.
  • This research methodology makes the research being conducted more competent and authentic.
  • It enables a researcher to understand the dynamic changes that can happen and to adjust the strategy accordingly.
  • The level of control in such a research is high so the researcher can control multiple variables.
  • It plays a vital role in increasing internal validity.

Even though empirical research makes research more competent and authentic, it does have a few disadvantages, some of which follow.

  • Such research requires patience, as it can be very time-consuming. The researcher has to collect data from multiple sources, and quite a few parameters are involved, which makes the research lengthy.
  • Most of the time, a researcher will need to conduct research at different locations or in different environments, which can be expensive.
  • There are rules governing how experiments can be performed, so permissions are needed, and it is often very difficult to obtain permission to carry out certain methods of this research.
  • Collecting data can sometimes be a problem, as it has to be gathered from a variety of sources through different methods.

LEARN ABOUT:  Social Communication Questionnaire

Empirical research is important in today's world because most people believe only in what they can see, hear, or experience. It is used to validate hypotheses, increase human knowledge, and keep advancing various fields.

For example: pharmaceutical companies use empirical research to try out a specific drug on controlled or random groups to study cause and effect, and thereby test the theories they had proposed for the drug. Such research is very important, as it can sometimes lead to a cure for a disease that has existed for many years. Empirical research is useful in science and in many other fields such as history, the social sciences, and business.

LEARN ABOUT: 12 Best Tools for Researchers

With the advancement of today's world, empirical research has become critical and the norm in many fields for supporting hypotheses and gaining knowledge. The methods mentioned above are very useful for carrying out such research. However, new methods will keep emerging as the nature of investigative questions changes.



Researching and writing for Economics students

3 Economics: methods, approaches, fields and relevant questions

3.1 Economic theory and empirical work: what is it?

What is economic theory and what can it do?

Unlike “theory” in some other social science disciplines, economic theory is mostly based on mathematical modelling and rigorous proof that certain conclusions or results can be derived from certain assumptions. But theory alone can say little about the real world.

In Economics: Models = Theory = Mathematics… for the most part.

What is empirical work and what can it do?

In contrast, empirical work gathers evidence from the real world, usually organized into systematic data sets.

  • It tries to bring evidence to refute or substantiate economic theory.
  • It tries to estimate parameters, such as a price elasticity or the government spending multiplier, in specific contexts.
  • It rigorously presents broad “stylized facts”, providing a clear picture of a market, industry, or situation.

Much empirical work itself relies on assumptions, either assumptions from economic theory, or assumptions about the data itself, or both. But empirical work does not “prove” anything. Instead, it presents evidence in favour of or against certain hypotheses, estimates parameters, and can, using the classical statistical framework, reject (or fail to reject) certain null hypotheses. What “rejecting” means is “if the assumptions underlying my estimation technique are correct, then it is highly unlikely that the null hypothesis holds.”
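As a concrete (and entirely simulated) illustration of this framework, the sketch below estimates a price elasticity from a log-log regression; the data-generating process, the true elasticity of −1.2, and the noise level are all invented for the example.

```python
# Sketch of classical parameter estimation: a price elasticity from
# simulated log-log data. Nothing here comes from a real data set.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
log_price = rng.normal(0, 0.5, 200)
log_quantity = 5.0 - 1.2 * log_price + rng.normal(0, 0.3, 200)  # true elasticity: -1.2

fit = sm.OLS(log_quantity, sm.add_constant(log_price)).fit()
print(f"estimated elasticity = {fit.params[1]:.2f} (p = {fit.pvalues[1]:.3g})")
# The t-test on the slope "rejects" a zero elasticity only in the sense
# described above: if the model's assumptions hold, a zero elasticity is
# highly unlikely to have produced these data.
```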

3.2 Normative vs. Positive

The word ‘normative’, also called ‘prescriptive’, often refers to what ought to be, what an ideal policy would be, or how to judge whether a particular welfare function is justifiable.

“Positive” work claims to be value-neutral and to address what is, or what must be, going on in the real world. Most modern economists would probably claim their work is “positive”, and in this sense “prescriptive” is, in my experience, often used as a pejorative. However, prescriptive papers can be very valuable if done well.

Note: There is also another context in which you will hear the expression ‘normative analysis.’ This may also be used to describe microeconomic analysis derived from the axioms of rational optimising behavior; this describes much of what you have covered in your textbook. This dual meaning of the word ‘normative’ is admittedly confusing!

3.3 Theoretical vs. Empirical (techniques)

Papers that use theory (modeling) as a technique typically start from a series of assumptions and try to derive results from these assumptions alone. They may motivate their focus or assumptions using previous empirical work and anecdotes, but these papers do not themselves use data, nor do they do what we call “econometrics”. Remember that in Economics, “theory papers” are usually highly mathematical and formal.

Empirical papers use evidence from the real world, usually to test hypotheses, but also to generate description and help formulate ideas and hypotheses.

3.4 Theoretical vs. Applied (focus)

“Theoretical” can also be used to describe a paper's focus; a theoretical paper in this sense addresses fundamentals of economic modeling. In principle, these may be widely applied across a range of fields, but they do not typically address any single policy issue or focus on a specific industry. These papers are often very difficult to read, and there is argument about whether many of them will ultimately “trickle down” to practical use. They typically use theory and modeling techniques rather than empirics, although some empirical papers may be aimed at addressing fundamental theoretical issues and parameters.

Papers with an “applied” focus will directly target a policy issue or a puzzle or question about the functioning of certain market or nonmarket interactions. Nearly all of the papers you will read and work on as an undergraduate are “applied” in this sense.

3.5 Categories of empirical approaches

“causal” vs. “descriptive”.

“Causal” papers try to get at whether one factor or event “A” can be seen to directly “cause” an outcome “B”. For example, “does an individual getting more years of schooling lead him or her to have higher income, on average?” A good way to think about this conception of causality is to consider the counterfactual: if a typical person who received a university degree had been randomly selected not to get this education, would his or her income have been lower than it now is? Similarly (but not identically), if a typical person without a university education had been randomly placed into a university program, would his or her income now be greater?

Since the real world does not usually present such clean experiments, “causal” empirical researchers rely on various techniques, which usually depend on “identification assumptions.” See, for example, control strategies, difference-in-differences, and instrumental variables techniques.
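A minimal difference-in-differences sketch follows; the groups, the data, and the true effect of 2.0 are simulated, and the interaction coefficient is causal only under the parallel-trends identification assumption.

```python
# Toy difference-in-differences: a "treated" group and a control group
# observed before and after a policy change. All data simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400
treated = rng.integers(0, 2, n)    # 1 = treated group
post = rng.integers(0, 2, n)       # 1 = after the policy change
effect = 2.0                       # true causal effect (assumed)
outcome = 10 + 1.5*treated + 0.5*post + effect*treated*post + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([treated, post, treated * post]))
fit = sm.OLS(outcome, X).fit()
print(f"DiD estimate of the effect: {fit.params[3]:.2f}")
# The interaction term identifies the effect only if, absent treatment,
# both groups would have followed parallel trends.
```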

“Descriptive” papers essentially aim to present a picture of “what the data look like” in an informative way. Causal relationships may be suggested, but the authors are not making a strong claim that they can identify them. They may present a data-driven portrait of an industry; of wealth and inequality in a country or globally over time; of particular patterns and trends in consumption; or of a panel of governments' monetary and fiscal policy. They may focus on the ‘functional form’ of relationships in the data and the ‘residual’ or ‘error structure’. They may hint at causal relationships or propose a governing model. They may identify a ‘puzzle’ in the data (e.g., the ‘equity premium puzzle’), propose potential explanations, and use the data to ‘provide support’ for these explanations. 5 They may devote much of the paper to providing a theoretical explanation (remember, in Economics these are usually mathematical models) for the pattern. They may also run statistical tests and report confidence intervals; one can establish a ‘statistically significant’ relationship between two variables even if the relationship is not (necessarily) causal. This is particularly important when one sees the data as subject to measurement error and/or as a sample from a larger population. E.g., just because age and wealth (or height and head-size, or political affiliation and food-preference) are strongly related to one another in a random representative sample of 10 people does not mean they are strongly related in the entire population. 6
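The small-sample caveat is easy to demonstrate with simulated data: below, two variables that are unrelated in a large “population” can still show a sizeable, even “significant”, correlation in a sample of 10.

```python
# Two unrelated variables can look related in a sample of 10.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
x = rng.normal(size=10_000)
y = rng.normal(size=10_000)      # x and y are independent in the "population"

idx = rng.choice(10_000, size=10, replace=False)
r, p = pearsonr(x[idx], y[idx])  # correlation in a random sample of 10
print(f"sample r = {r:.2f} (p = {p:.2f}); population r is essentially 0")
# Small samples routinely produce sizeable correlations by chance, which is
# why descriptive papers report confidence intervals and formal tests.
```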

Structural vs. Reduced Form

This is a rather complicated issue, and there are long debates over the merits of each approach.

In brief, structural empirical papers might be said to use theory to derive necessary relationships between variables and appropriate functional forms, often as part of a system of equations describing a broad model. They then “take this model to the data” and estimate certain parameters; these estimates rely on the key structural assumptions and the chosen functional form (often selected for convenience) holding in the real world. They may also check how “robust” the estimates are to alternative assumptions and forms. Structural estimates can then be used to make precise predictions and welfare calculations.

Reduced form work may begin with some theoretical modeling, but it will not usually try to estimate the model directly. It often involves estimating single equations, which may be “partial equilibrium”, and it may use linear regression interpreted as a “best linear approximation” to the true unknown functional form. Reduced form researchers often claim that their results are “more robust” than structural work, while proponents of structural work may claim that reduced form econometrics is not theoretically grounded and thus meaningless.
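The “best linear approximation” reading can also be made concrete with a simulation: below, the true relationship is nonlinear and unknown to the researcher, yet OLS still returns a well-defined linear summary of it (the functional form and noise are, again, invented).

```python
# Reduced-form OLS as a best linear approximation to a nonlinear truth.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.uniform(0, 4, 500)
y = np.log1p(x) + rng.normal(0, 0.1, 500)   # true form: log(1 + x), unknown to us

fit = sm.OLS(y, sm.add_constant(x)).fit()
print(f"linear approximation: y = {fit.params[0]:.2f} + {fit.params[1]:.2f} * x")
# The slope is not a structural parameter; it is the average linear response
# implied by the unknown true function over this range of the data.
```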

Most of you are likely to focus on reduced form empirical work.

Quantitative vs. qualitative (the latter is rare in economics)

Quantitative research deals with data that can be quantified, i.e., expressed in terms of numbers and strict categories, often with hierarchical relationships.

Qualitative research is rarely done in modern economics. It relies on “softer” forms of data like interviews that cannot be reduced to a number or parameter, and cannot be dealt with using statistical techniques.

3.6 Methodological research

Methodological research is aimed at producing and evaluating techniques and approaches that can be used by other researchers. Most methodological research in economics is done by econometricians, who develop and evaluate techniques for estimating relationships using data.

3.7 Fields of economics, and some classic questions asked in each field

Economics is about choices under conditions of scarcity, the interaction of individuals, governments, and firms, and the consequences of these. [citation needed]

Microeconomics

Preferences and choices under constraints; e.g., “how do risk-averse individuals choose among a set of uncertain gambles?” … “How does consumption of leisure change in response to an increase in the VAT?”

Game theory, interactions; … “How do individuals coordinate in ‘stag hunt’ games, and are these equilibria robust to small errors?”

Mechanism design and contract theory; … “How can a performance scheme be designed to induce the optimal level of effort with asymmetric information about ability?”

Equilibrium; … “Is the general equilibrium of an economy with indivisible goods Pareto optimal?”

Macroeconomics

Stabilisation; … “How do changes in the level of government spending affect changes in the rate of unemployment?”

Growth; … “Why did GDP per capita increase in Western Europe between 1950 and 1980?”

Aggregates, stocks, and flows; … “Does a trade deficit lead to a government budget deficit, or vice versa (or both, or neither)?”

Money and Banking; … “Does deposit insurance decrease the likelihood of a bank run?”

Financial Economics (not as broad as the first two)

“Can an investor use publicly available information to systematically earn supernormal profits?” (the Efficient Markets Hypothesis)

Econometrics (methods/technique)

“What is the lowest mean squared error unbiased estimator of a gravity equation?”

Experimental economics (a technique)

Do laboratory subjects (usually students) coordinate on the efficient equilibrium in ‘stag hunt’ games? Do stronger incentives increase the likelihood of this type of play?

Behavioural economics (an alternate approach to micro)

“Can individual choices over time be rationalised by standard exponential discounting, or do they follow another model, such as time inconsistent preferences and hyperbolic discounting?”

Applied fields

Development.

“Has the legacy of British institutions increased or decreased the level of GDP in former colonies?”

“Do greater unemployment benefits increase the length of an unemployment spell, and if so, to what extent?”

“Does public support for education increase or decrease income inequality?”

“Why did the industrial revolution first occur in Britain rather than in another country?”

“Are protectionist ‘infant industry’ policies usually successful in fostering growth?”

International

“Do floating (rather than fixed) exchange rates lead to macroeconomic instability?”

Environmental

“What is the appropriate discount rate to use for considering costly measures to reduce carbon emissions?”

Industrial Organization

“Do firms innovate more or less when they have greater market power in an industry?”

“Do ‘single payer’ health care plans like the NHS provide basic health care services more or less efficiently than policies of mandated insurance and regulated exchanges, as in the Netherlands?”

A more extensive definition and discussion of fields is in Appendix A of “Writing Economics”

Do you know?…

Which type of analysis typically uses the most ‘difficult, formal’ maths? 7

  • Microeconomic theory
  • Applied econometric analysis
  • Descriptive macroeconomics

Another use of data: ‘calibrating’ models aka ‘calibration exercises’; I will not discuss this at the moment. ↩

Sometimes this can be confusing, particularly when the data seem to represent the entire ‘population’ of interest, such as an industry’s price and sales data in a relevant period. Without getting into an extensive discussion of the meaning of probability and statistics, I will suggest that we can see this as a ‘sample of the prices and sales that could have occurred in any possible universe, or over a period of many years’. Ouch, this gets thorny, and there are strong debates in the Statistics world about this stuff. ↩

Answer: 1. Microeconomic theory ↩


Methods Used in Economic Research: An Empirical Study of Trends and Levels

The methods used in economic research are analyzed on a sample of all 3,415 regular research papers published in 10 general interest journals every 5th year from 1997 to 2017. The papers are classified into three main groups by method: theory, experiments, and empirics. The theory and empirics groups are almost equally large. Most empirical papers use the classical method, which derives an operational model from theory and runs regressions. The number of papers published increases by 3.3% p.a. Two trends are highly significant: the fraction of theoretical papers has fallen by 26 pp (percentage points), while the fraction of papers using the classical method has increased by 15 pp. Economic theory predicts that such papers exaggerate, and the papers that have been analyzed by meta-analysis confirm the prediction. It is discussed whether other methods have smaller problems.

1 Introduction

This paper studies the pattern in the research methods of economics using a sample of 3,415 regular papers published in the years 1997, 2002, 2007, 2012, and 2017 in 10 journals. The analysis builds on the beliefs that truth exists, but that it is difficult to find, and that all the methods listed in the next paragraph have problems, as discussed in Sections 2 and 4. Hereby I do not imply that all – or even most – papers have these problems, but we rarely know how serious they are when we read a paper. A key aspect of the problem is that a “perfect” study is very demanding and requires far too much space to report, especially if the paper looks for usable results. Thus, each paper is just one look at an aspect of the problem analyzed. Only when many studies using different methods reach a joint finding can we trust that it is true.

Section 2 discusses the classification of papers by method into three main categories: (M1) Theory, with three subgroups: (M1.1) economic theory, (M1.2) statistical methods, and (M1.3) surveys. (M2) Experiments, with two subgroups: (M2.1) lab experiments and (M2.2) natural experiments. (M3) Empirics, with three subgroups: (M3.1) descriptive, (M3.2) classical empirics, and (M3.3) newer empirics. More than 90% of the papers are easy to classify, but a stochastic element enters in the classification of the rest. Thus, the study has some – hopefully random – measurement errors.

Section 3 discusses the sample of journals chosen. The choice has been limited by the following main criteria: they should be good journals below the top ten A-journals, i.e., my article covers B-journals, which are the journals where most research economists publish. They should be general interest journals, and the journals should be so different that it is likely that patterns which generalize across these journals apply to more (most?) journals. The Appendix gives some crude counts of researchers, departments, and journals. It assesses that there are about 150 B-level journals, but less than half meet the criteria, so I have selected about 15% of the possible ones. This is the most problematic element in the study. If the reader accepts my choice, the paper tells an interesting story about economic research.

All B-level journals try hard to have a serious refereeing process. If our selection is representative, the 150 journals have increased the annual number of papers published from about 7,500 in 1997 to about 14,000 papers in 2017, giving about 200,000 papers for the period. Thus, the B-level dominates our science. Our sample is about 6% for the years covered, but less than 2% of all papers published in B-journals in the period. However, it is a larger fraction of the papers in general interest journals.

It is impossible for anyone to read more than a small fraction of this flood of papers. Consequently, researchers compete for space in journals and for attention from the readers, as measured in the form of citations. It should be uncontroversial that papers that hold a clear message are easier to publish and get more citations. Thus, an element of sales promotion may enter papers in the form of exaggeration , which is a joint problem for all eight methods. This is in accordance with economic theory that predicts that rational researchers report exaggerated results; see Paldam ( 2016 , 2018 ). For empirical papers, meta-methods exist to summarize the results from many papers, notably papers using regressions. Section 4.4 reports that meta-studies find that exaggeration is common.
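For intuition about how such meta-methods pool results, here is a minimal fixed-effect sketch using inverse-variance weights; the five estimates and standard errors are invented, not taken from any of the meta-studies discussed.

```python
# Minimal fixed-effect meta-analysis: pooling estimates of "the same"
# parameter from several papers (all numbers invented).
import numpy as np

estimates = np.array([0.42, 0.15, 0.58, 0.23, 0.31])   # reported coefficients
std_errors = np.array([0.20, 0.08, 0.25, 0.10, 0.12])  # their standard errors

weights = 1 / std_errors**2                  # inverse-variance weights
pooled = np.sum(weights * estimates) / np.sum(weights)
pooled_se = np.sqrt(1 / np.sum(weights))
print(f"pooled estimate = {pooled:.3f} (se = {pooled_se:.3f})")
# If small, imprecise studies systematically report larger estimates than
# precise ones, the literature exhibits the exaggeration discussed above;
# this is often inspected with a funnel plot of estimates against precision.
```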

The empirical literature surveying the use of research methods is quite small; I have found only two articles: Hamermesh ( 2013 ) covers 748 articles from six years, each a decade apart, in three A-journals, using a slightly different classification of methods, [1] while my study covers B-journals. Angrist, Azoulay, Ellison, Hill, and Lu ( 2017 ) use a machine-learning classification of 134,000 papers in 80 journals to look at the three main methods. My study subdivides the three categories into eight. The machine-learning algorithm is only sketched, so the paper is difficult to replicate, but it is surely a major effort. A key result in both articles is the strong decrease of theory in economic publications. This finding is confirmed, and it is shown that the corresponding increase in empirical articles is concentrated on the classical method.

I have tried to explain what I have done, so that everything is easy to replicate, in full or for one journal or one year. The coding of each article is available at least for the next five years. I should add that I have been in economic research for half a century. Some of the assessments in the paper will reflect my observations/experience during this period (indicated as my assessments). This especially applies to the judgements expressed in Section 4.

2 The eight categories

Table 1 reports that the annual number of papers in the ten journals has increased 1.9 times, i.e., by 3.3% per year. The Appendix gives the full counts per category, journal, and year. By looking at data over two decades, I study how economic research develops. The increase in the production of papers is caused by two factors: the increase in the number of researchers, and the increasing importance of publications for the careers of researchers.

The 3,415 papers

Year   Papers   Fraction (%)     Period       Annual increase (%)
1997      464       13.6         1997–2002           2.2
2002      518       15.2         2002–2007           4.0
2007      661       19.4         2007–2012           4.6
2012      881       25.8         2012–2017           0.2
2017      891       26.1
Sum     3,415      100           1997–2017           3.3
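As a consistency check on the table, the average annual growth g solves 464 × (1 + g)^20 = 891, so g = (891/464)^(1/20) − 1 ≈ 0.033, i.e., the 3.3% p.a. reported.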

2.1 (M1) Theory: subgroups (M1.1) to (M1.3)

Table 2 lists the groups and main numbers discussed in the rest of the paper. Section 2.1 discusses (M1) theory. Section 2.2 covers (M2) experimental methods, while Section 2.3 looks at (M3) empirical methods using statistical inference from data.

The 3,415 papers – fractions in percent

(M1) Theory: 49.6
    (M1.1) Economic theory: 45.2
    (M1.2) Statistical technique, incl. forecasting: 2.5
    (M1.3) Surveys, incl. meta-studies: 2.0
(M2) Experimental: 6.4
    (M2.1) Experiments in laboratories: 5.7
    (M2.2) Events, incl. real life experiments: 0.7
(M3) Data inference: 43.7
    (M3.1) Descriptive, deductions from data: 10.7
    (M3.2) Classical empirical studies: 28.5
    (M3.3) Newer techniques: 4.5

The change of the fractions from 1997 to 2017 in percentage points

(M1) Theory: −24.7
    (M1.1) Economic theory: −25.9
    (M1.2) Statistical technique, incl. forecasting: +2.2
    (M1.3) Surveys, incl. meta-studies: −1.0
(M2) Experimental: +9.0
    (M2.1) Experiments in laboratories: +7.7
    (M2.2) Events, incl. real life experiments: +1.3
(M3) Data inference: +15.8
    (M3.1) Descriptive, deductions from data: +2.4
    (M3.2) Classical empirical studies: +15.0
    (M3.3) Newer techniques: −1.7

Note: Section 3.4 tests if the pattern observed in Table 3 is statistically significant. The Appendix reports the full data.

2.1.1 (M1.1) Economic theory

These are papers whose main content is the development of a theoretical model. The ideal theory paper presents a (simple) new model that recasts the way we look at something important. Such papers are rare and obtain large numbers of citations. Most theoretical papers present variants of known models and obtain few citations.

In a few papers, the analysis is verbal, but more than 95% rely on mathematics, though the technical level differs. Theory papers may start by a descriptive introduction giving the stylized fact the model explains, but the bulk of the paper is the formal analysis, building a model and deriving proofs of some propositions from the model. It is often demonstrated how the model works by a set of simulations, including a calibration made to look realistic. However, the calibrations differ greatly by the efforts made to reach realism. Often, the simulations are in lieu of an analytical solution or just an illustration suggesting the magnitudes of the results reached.

Theoretical papers suffer from the problem known as T-hacking , [2] where the able author by a careful selection of assumptions can tailor the theory to give the results desired. Thus, the proofs made from the model may represent the ability and preferences of the researcher rather than the properties of the economy.

2.1.2 (M1.2) Statistical method

Papers reporting new estimators and tests are published in a handful of specialized journals in econometrics and mathematical statistics – such journals are not included. In our general interest journals, some papers compare estimators on actual data sets. If the demonstration of a methodological improvement is the main feature of the paper, it belongs to (M1.2), but if the economic interpretation is the main point of the paper, it belongs to (M3.2) or (M3.3). [3]

Some papers, including a special issue of Empirical Economics (vol. 53–1), deal with forecasting models. Such models normally have a weak relation to economic theory. They are sometimes justified precisely because of their eclectic nature. They are classified as either (M1.2) or (M3.1), depending upon the focus. It appears that different methods work better on different data sets, and perhaps a trade-off exists between the user-friendliness of the model and the improvement reached.

2.1.3 (M1.3) Surveys

When the literature in a certain field becomes substantial, it normally presents a motley picture with an amazing variation, especially when different schools exist in the field. Thus, a survey is needed, and our sample contains 68 survey articles. They are of two types, where the second type is still rare:

2.1.3.1 (M1.3.1) Assessed surveys

Here, the author reads the papers and assesses what the most reliable results are. Such assessments require judgement that is often quite difficult to distinguish from priors, even for the author of the survey.

2.1.3.2 (M1.3.2) Meta-studies

They are quantitative surveys of estimates of parameters claimed to be the same. Over the two decades from 1997 to 2017, about 500 meta-studies have been made in economics. Our sample includes five, which is 0.15%. [4] Meta-analysis has two levels: The basic level collects and codes the estimates and studies their distribution. This is a rather objective exercise where results seem to replicate rather well. [5] The second level analyzes the variation between the results. This is less objective. The papers analyzed by meta-studies are empirical studies using method (M3.2), though a few use estimates from (M3.1) and (M3.3).

2.2 (M2) Experimental methods: subgroups (M2.1) and (M2.2)

Experiments are of three distinct types; the last two are rare, so they are lumped together. They take place in real life.

2.2.1 (M2.1) Lab experiments

In 1997, 1.9% of the papers in the sample used this method; by 2017 this had expanded to 9.7%. It is a technique that is much easier to apply to micro- than to macroeconomics, so it has spread unequally across the 10 journals, and many experiments are reported in a couple of specialized journals that are not included in our sample.

Most of these experiments take place in a laboratory, where the subjects communicate with a computer, giving a controlled, but artificial, environment. [6] A number of subjects are told a (more or less abstract) story and paid to react in either of a number of possible ways. A great deal of ingenuity has gone into the construction of such experiments and in the methods used to analyze the results. Lab experiments do allow studies of behavior that are hard to analyze in any other way, and they frequently show sides of human behavior that are difficult to rationalize by economic theory. It appears that such demonstration is a strong argument for the publication of a study.

However, everything is artificial – even the payment. In some cases, the stories told are so elaborate and abstract that framing must be a substantial risk; [7] see Levitt and List ( 2007 ) for a lucid summary, and Bergh and Wichardt ( 2018 ) for a striking example. In addition, experiments cost money, which limits the number of subjects. It is also worth pointing to the difference between expressive and real behavior. It is typically much cheaper for the subject to “express” nice behavior in a lab than to be nice in the real world.

(M2.2) Event studies are studies of real world experiments. They are of two types:

(M2.2.1) Field experiments analyze cases where some people get a certain treatment and others do not. The “gold standard” for such experiments is double blind random sampling, where everything (but the result!) is preannounced; see Christensen and Miguel ( 2018 ). Experiments with humans require permission from the relevant authorities, and the experiment takes time too. In the process, things may happen that compromise the strict rules of the standard. [8] Controlled experiments are expensive, as they require a team of researchers. Our sample of papers contains no study that fulfills the gold standard requirements, but there are a few less stringent studies of real life experiments.

(M2.2.2) Natural experiments take advantage of a discontinuity in the environment, i.e., the period before and after an (unpredicted) change of a law, an earthquake, etc. Methods have been developed to find the effect of the discontinuity. Often, such studies look like (M3.2) classical studies with many controls that may or may not belong. Thus, the problems discussed under (M3.2) will also apply.

2.3 (M3) Empirical methods: subgroups (M3.1) to (M3.3)

The remaining methods are studies making inference from “real” data, which are data samples where the researcher chooses the sample, but has no control over the data generating process.

(M3.1) Descriptive studies are inductive. The researcher describes the data, aiming at finding structures that tell a story which can be interpreted. The findings may call for a formal test. If one clean test follows from the description, [9] the paper is classified under (M3.1). If a more elaborate regression analysis is used, it is classified as (M3.2). Descriptive studies often contain a great deal of theory.

Some descriptive studies present a new data set developed by the author to analyze a debated issue. In these cases, it is often possible to make a clean test, so to the extent that biases sneak in, they are hidden in the details of the assessments made when the data are compiled.

(M3.2) Classical empirics has three steps: It starts from a theory, which is developed into an operational model. Then it presents the data set, and finally it runs regressions.

The significance levels of the t-ratios on the estimated coefficients assume that the regression is the first meeting of the estimation model and the data. We all know that this is rarely the case; see also point (m1) in Section 4.4. In practice, the classical method is often just a presentation technique. The great virtue of the method is that it can be applied to real problems outside academia. The relevance comes with a price: the method is quite flexible, as many choices have to be made, and they often give different results. Preferences and interests, as discussed in Sections 4.3 and 4.4 below, notably as point (m2), may affect these choices.

(M3.3) Newer empirics . Partly as a reaction to the problems of (M3.2), the last 3–4 decades have seen a whole set of newer empirical techniques. [10] They include different types of VARs, Bayesian techniques, causality/co-integration tests, Kalman Filters, hazard functions, etc. I have found 162 (or 4.7%) papers where these techniques are the main ones used. The fraction was highest in 1997. Since then it has varied, but with no trend.

I think that the main reason for the lack of success for the new empirics is that it is quite bulky to report a careful set of co-integration tests or VARs, and they often show results that are far from useful in the sense that they are unclear and difficult to interpret. With some introduction and discussion, there is not much space left in the article. Therefore, we are dealing with a cookbook that makes for rather dull dishes, which are difficult to sell in the market.

Note the contrast between (M3.2) and (M3.3): (M3.2) makes it possible to write papers that are too good, while (M3.3) often makes them too dull. This contributes to explain why (M3.2) is getting (even) more popular and the lack of success of (M3.3), but then, it is arguable that it is more dangerous to act on exaggerated results than on results that are weak.

3 The 10 journals

The 10 journals chosen are: (J1) Can [Canadian Journal of Economics], (J2) Emp [Empirical Economics], (J3) EER [European Economic Review], (J4) EJPE [European Journal of Political Economy], (J5) JEBO [Journal of Economic Behavior & Organization], (J6) Inter [Journal of International Economics], (J7) Macro [Journal of Macroeconomics], (J8) Kyklos, (J9) PuCh [Public Choice], and (J10) SJE [Scandinavian Journal of Economics].

Section 3.1 discusses the choice of journals, while Section 3.2 considers how journals deal with the pressure for publication. Section 3.3 shows the marked difference in publication profile of the journals, and Section 3.4 tests if the trends in methods are significant.

3.1 The selection of journals

(i) They should be general interest journals – methodological journals are excluded. By general interest, I mean that they bring papers where an executive summary may interest policymakers and people in general. (ii) They should be journals in English (the Canadian Journal includes one paper in French), which are open to researchers from all countries, so that the majority of the authors are from outside the country of the journal. [11] (iii) They should be sufficiently different that patterns which apply to these journals tell a believable story about economic research. Note that (i) and (iii) require some compromises, as is evident in the choice of (J2), (J6), (J7), and (J8) (Table 4).

The 10 journals covered

Code   Name    Volumes (1997/2002/2007/2012/2017)       Papers (1997/2002/2007/2012/2017)    All     Growth (% p.a.)
(J1)   Can     30 / 35 / 40 / 45 / 50                   68 / 43 / 55 / 66 / 46               278     −1.9
(J2)   Emp     22 / 27 / 32–43 / 42–3 / 52–3            33 / 36 / 48 / 104 / 139             360      7.5
(J3)   EER     41 / 46 / 51 / 56 / 91–100               56 / 91 / 89 / 106 / 140             482      4.7
(J4)   EJPE    13 / 18 / 23 / 28 / 46–50                42 / 40 / 68 / 47 / 49               246      0.8
(J5)   JEBO    32 / 47–9 / 62–4 / 82–4 / 133–44         41 / 85 / 101 / 207 / 229            663      9.0
(J6)   Inter   42 / 56–8 / 71–3 / 86–8 / 104–9          45 / 59 / 66 / 87 / 93               350      3.7
(J7)   Macro   19 / 24 / 29 / 34 / 51–4                 44 / 25 / 51 / 79 / 65               264      2.0
(J8)   Kyklos  50 / 55 / 60 / 65 / 70                   21 / 22 / 30 / 29 / 24               126      0.7
(J9)   PuCh    90–3 / 110–3 / 130–3 / 150–3 / 170–3     83 / 87 / 114 / 99 / 67              450     −1.1
(J10)  SJE     99 / 104 / 109 / 114 / 119               31 / 30 / 39 / 57 / 39               196      1.2
All                                                     464 / 518 / 661 / 881 / 891          3,415    3.3

Note. Growth is the average annual growth from 1997 to 2017 in the number of papers published.

Methodological journals are excluded, as they are not interesting to outsiders. However, new methods are developed to be used in general interest journals. From studies of citations, we know that useful methodological papers are highly cited. If they remain unused, we presume that it is because they are useless, though, of course, there may be a long lag.

The choice of journals may contain some subjectivity, but I think that they are sufficiently diverse so that patterns that generalize across these journals will also generalize across a broader range of good journals.

The papers included are the regular research articles. Consequently, I exclude short notes to other papers and book reviews, [12] except for a few article-long discussions of controversial books.

3.2 Creating space in journals

As mentioned in the introduction, the annual production of research papers in economics has now reached about 1,000 papers in top journals and about 14,000 papers in the group of good journals. [13] Production has grown by 3.3% per year, and thus has doubled over the last twenty years. The hard-working researcher will read less than 100 papers a year, and I know of no signs that this number is increasing. The upward trend in publication must therefore be due to the greatly increased importance of publications for the careers of researchers, which has greatly increased the production of papers. There has also been a large increase in the number of researchers, but as citations are increasingly skewed toward the top journals (see Heckman & Moktan, 2018), this has not increased the demand for papers correspondingly. The pressures from the supply side have caused journals to look for ways to create space.

Book reviews have dropped to less than a third of their former number. Perhaps this also indicates that economists read fewer books than they used to. Journals have increasingly come to use smaller fonts and larger pages, allowing more words per page. The journals from North-Holland Elsevier have managed to cram almost two old pages into one new one. [14] This makes it easier to publish papers, while they become harder to read.

Many journals have changed their numbering system for the annual issues, making it less transparent how much they publish. Only three – Canadian Economic Journal, Kyklos, and Scandinavian Journal of Economics – have kept the schedule of publishing one volume of four issues per year. It gives about 40 papers per year. Public Choice has a (fairly) consistent system with four volumes of two double issues per year – this gives about 100 papers. The remaining journals have changed their numbering system and increased the number of papers published per year – often dramatically.

Thus, I assess that the wave of publications is caused by the increased supply of papers and not by the demand for reading material. Consequently, the study confirms and updates the observation by Temple (1918, p. 242): “… as the world gets older the more people are inclined to write but the less they are inclined to read.”

3.3 How different are the journals?

The appendix reports the counts of the research methods for each year and journal. From these counts, a set of χ²-scores is calculated for the three main groups of methods; they are reported in Table 5, which gives the χ²-test comparing the profile of each journal to that of the other nine journals, taken as the theoretical distribution.

The methodological profile of the journals – χ²-scores for main groups

Code   Name     (M1) Theory   (M2) Experiment   (M3) Empirical   Sum (χ²(3))   P-value (%)
(J1)   Can       7.4 (+)        15.3 (−)           1.7 (−)          24.4          0.00
(J2)   Emp      47.4 (−)        16.0 (−)          89.5 (+)         152.9          0.00
(J3)   EER      17.8 (+)         0.3 (−)          16.5 (−)          34.4          0.00
(J4)   EJPE      0.1 (+)        11.2 (−)           1.0 (+)          12.2          0.31
(J5)   JEBO      1.6 (−)      1357.7 (+)          41.1 (−)        1404.4          0.00
(J6)   Inter     2.4 (+)        24.8 (−)           0.1 (+)          27.3          0.00
(J7)   Macro     0.1 (+)        18.2 (−)           1.7 (+)          20.0          0.01
(J8)   Kyklos   20.1 (−)         3.3 (−)          31.2 (+)          54.6          0.00
(J9)   PuCh      0.0 (+)        11.7 (−)           2.2 (+)          13.9          0.14
(J10)  SJE      10.5 (+)         1.8 (−)           8.2 (−)          20.4          0.01

Note: The χ²-scores are calculated relative to all other journals. The sign (+) or (−) indicates whether the journal has relatively too many or too few papers in the category. The P-values for the χ²(3)-test always reject that the journal has the same methodological profile as the other nine journals.

The test rejects that the distribution is the same as the average for any of the journals. The closest to the average are the EJPE and Public Choice. The two most deviating scores are for the most micro-oriented journal, JEBO, which brings many experimental papers, and, of course, Empirical Economics, which brings many empirical papers.
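The profile test can be sketched as follows; the counts are illustrative, not the paper's actual data. One journal's distribution over (M1)-(M3) is compared with the distribution implied by the pooled shares of the other nine journals.

```python
# Sketch of the Table 5 profile test with invented counts: does one
# journal's method mix match the pooled profile of the other nine?
from scipy.stats import chisquare

journal_counts = [150, 60, 150]        # papers in (M1), (M2), (M3) for one journal
others_shares = [0.50, 0.06, 0.44]     # pooled method shares of the other nine

n = sum(journal_counts)
expected = [s * n for s in others_shares]
chi2, p = chisquare(journal_counts, f_exp=expected)
print(f"chi2 = {chi2:.1f}, p = {p:.4f}")
# A small p-value rejects that this journal shares the pooled profile.
```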

3.4 Trends in the use of the methods

Table 3 already gave an impression of the main trends in the methods preferred by economists. I now test whether these impressions are statistically significant. The tests have to be tailored to disregard three differences between the journals: their methodological profiles, the number of papers they publish, and the trend in that number. Table 6 reports a set of distribution-free tests which overcome these differences. The tests are done on the shares of each research method for each journal. As the data cover five years, there are 10 pairs of years to compare. [15] The three trend-scores in the []-brackets count how often the shares go up, go down, or stay the same across the 10 pairs. This is the count done for a Kendall rank correlation comparing the five shares with a positive trend (such as 1, 2, 3, 4, and 5).

Trend-scores and tests for the eight subgroups of methods across the 10 journals

Journal: Code, Name   (M1.1) Theory   (M1.2) Stat met   (M1.3) Survey   (M2.1) Exp.   (M2.2) Event   (M3.1) Descript.   (M3.2) Classical   (M3.3) Newer
(J1) Can [1, 9, 0] [6, 3, 1] [6, 3, 1] [3, 1, 6] [3, 1, 6] [6, 4, 0] [8, 2, 0] [5, 4, 1]
(J2) Emp [2, 8, 0] [6, 4, 0] [0, 7, 3] [0, 4, 6] [3, 4, 3] [6, 4, 0] [8, 2, 0] [4, 6, 0]
(J3) EER [3, 7, 0] [4, 0, 6] [3, 1, 6] [7, 3, 0] [8, 2, 0] [3, 7, 0]
(J4) EJPE [0, 0, 10] [4, 0, 6] [4, 0, 6] [4, 6, 0] [8, 1, 0]
(J5) JEBO [2, 8, 0] [6, 1, 3] [6, 3, 1] [7, 3, 0] [6, 1, 3] [4, 6, 0] [8, 2, 0] [2, 4, 3]
(J6) Inter [0, 0, 10] [0, 0, 10] [0, 0, 10] [0, 0, 10] [8, 2, 0] [8, 2, 0] [4, 6, 0]
(J7) Macro [6, 4, 0] [5, 5, 0] [7, 2, 1] [0, 0, 10] [0, 0, 10] [3, 7, 0]
(J8) Kyklos [2, 8, 0] [0, 0, 10] [2, 2, 6] [2, 7, 1] [0, 0, 10] [4, 6, 0] [2, 2, 6]
(J9) PuCh [3, 7, 0] [4, 3, 3] [6, 3, 1] [4, 3, 3] [0, 0, 10] [5, 5, 0] [6, 4, 0] [6, 3, 1]
(J10) SJE [4, 0, 6] [6, 3, 1] [1, 3, 6] [3, 1, 6] [6, 4, 0] [6, 4, 0] [6, 1, 1]
All (100 per column) [22, 78, 0] [35, 16, 49] [35, 41, 24] [30, 22, 48] [22, 8, 70] [59, 41, 0] [73, 27, 0] [42, 43, 13]
Binomial test: (M1.3) 56%, (M2.1) 33%, (M3.1) 8.86%, (M3.3) 100%; the tests for (M1.1), (M1.2), (M2.2), and (M3.2) are significant at the 5% level.

Note: The three trend-scores in each [I1, I2, I3]-bracket are a Kendall-count over all 10 combinations of years. I1 counts how often the share goes up, I2 counts when the share goes down, and I3 counts the number of ties. Most ties occur when there are no observations in either year. Thus, I1 + I2 + I3 = 10. The tests are two-sided binomial tests disregarding the zeroes. The test results in bold are significant at the 5% level.

The first set of trend-scores, for (M1.1) and (J1), is [1, 9, 0]. It means that one of the 10 share-pairs increases, while nine decrease and no ties are found. The two-sided binomial test gives 2%, so this is unlikely to happen by chance. Nine of the ten journals in the (M1.1)-column have a majority of falling shares. The important point is that the counts in one column can be added, as is done in the All-row; this gives a powerful trend test that disregards differences between journals and the number of papers published (Table A1).

Four of the trend-tests are significant: the fall in theoretical papers, the rise in classical papers, and the rises in the shares of stat method and event studies. It is surprising that there is no trend in the number of experimental studies, but see Table A2 (in the Appendix).
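The All-row counts can be checked directly. A minimal sketch of the two-sided binomial trend test, applied here to the (M3.1) column of Table 6:

```python
# Two-sided binomial trend test on the Table 6 "All" row for (M3.1):
# the share rose in 59 of the 100 non-tied year-pairs and fell in 41.
from scipy.stats import binomtest

ups, downs = 59, 41
result = binomtest(ups, n=ups + downs, p=0.5, alternative='two-sided')
print(f"p-value = {result.pvalue:.4f}")   # ~0.089, matching the 8.86% in Table 6
# Ties are disregarded; under the null of no trend, rises and falls are
# equally likely, so a sufficiently lopsided count signals a trend.
```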

4 An attempt to interpret the pattern found

The development in the methods pursued by researchers in economics is a reaction to the demand and supply forces on the market for economic papers. As already argued, it seems that a key factor is the increasing production of papers.

The shares add to 100, so the decline of one method means that the others rise. Section 4.1 looks at the biggest change – the reduction in theory papers. Section 4.2 discusses the rise in two new categories. Section 4.3 considers the large increase in the classical method, while Section 4.4 looks at what we know about that method from meta-analysis.

4.1 The decline of theory: economics suffers from theory fatigue [16]

The papers in economic theory have dropped from 59.5 to 33.6% – the largest change for any of the eight subgroups. [17] It is highly significant in the trend test. I attribute this drop to theory fatigue.

As mentioned in Section 2.1, the ideal theory paper presents a (simple) new model that recasts the way we look at something important. However, most theory papers are less exciting: They start from the standard model and argue that a well-known conclusion reached from the model hinges upon a debatable assumption – if it changes, so does the conclusion. Such papers are useful. From a literature on one main model, the profession learns its strengths and weaknesses. It appears that no generally accepted method exists to summarize this knowledge in a systematic way, though many thoughtful summaries have appeared.

I think that there is a deeper problem explaining theory fatigue. It is that many theoretical papers are quite unconvincing. Granted that the calculations are done right, believability hinges on the realism of the assumptions at the start and of the results presented at the end. In order for a model to convince, it should (at least) demonstrate the realism of either the assumptions or the outcome. [18] If both ends appear to hang in the air, it becomes a game giving little new knowledge about the world, however skillfully played.

The theory fatigue has caused a demand for simulations demonstrating that the models can mimic something in the world. Kydland and Prescott pioneered calibration methods (see their 1991 article). Calibrations may be carefully done, but a calibration often appears to be a numerical solution of a model that is too complex to allow an analytical solution.

4.2 Two examples of waves: one that is still rising and another that is fizzling out

When a new method of gaining insights in the economy first appears, it is surrounded by doubts, but it also promises a high marginal productivity of knowledge. Gradually the doubts subside, and many researchers enter the field. After some time this will cause the marginal productivity of the method to fall, and it becomes less interesting. The eight methods include two newer ones: Lab experiments and newer stats. [19]

It is not surprising that papers with lab experiments are increasing, though it did take a long time: The seminal paper presenting the technique was Smith ( 1962 ), but only a handful of papers are from the 1960s. Charles Plott organized the first experimental lab 10 years later – this created a new standard for experiments, but required an investment in a lab and some staff. Labs became more common in the 1990s as PCs got cheaper and software was developed to handle experiments, but only 1.9% of the papers in the 10 journals reported lab experiments in 1997. This has now increased to 9.7%, so the wave is still rising. The trend in experiments is concentrated in a few journals, so the trend test in Table 6 is insignificant, but it is significant in the Appendix Table A2 , where it is done on the sum of articles irrespective of the journal.

In addition to the rising share of lab experiment papers in some journals, the journal Experimental Economics was started in 1998, where it published 281 pages in three issues. In 2017, it had reached 1,006 pages in four issues, [20] which is an annual increase of 6.5%.

Compared with the success of experimental economics, the motley category of newer empirics has had more modest success: its fractions of papers in the five sample years are 5.8, 5.2, 3.5, 5.4, and 4.2%, showing no trend. Newer stats also require investment, but mainly in human capital. [21] Some papers using the classical methodology contain a table with Dickey–Fuller tests or some eigenvalues of the data matrix, but these are normally peripheral to the analysis. A couple of papers use Kalman filters, and a dozen use Bayesian VARs. It is clear, however, that the newer empirics have made little headway into our sample of general-interest journals.

4.3 The steady rise of the classical method: flexibility rewarded

The typical classical paper provides estimates of a key effect that decision-makers outside academia want to know. This makes the paper policy relevant from the start, and in many cases it is possible to write a one-page executive summary for these decision-makers.

The three-step convention (see Section 2.3) is often followed rather loosely. The estimation model is nearly always much simpler than the theory. Thus, while the model can be derived from the theory, the reverse does not apply. Sometimes the model seems to follow straight from common sense, and if the link from the theory to the model is thin, it raises the question: Is the theory really necessary? In such cases, it is hard to be convinced that the tests “confirm” the theory – but then, of course, tests only say that the data do not reject the theory.

The classical method is often only a presentation device. Think of a researcher who has reached a nice publishable result through a long and tortuous path, including some failed attempts. It is not possible to describe that path within the severely limited space of an article; such a description would also be rather dull to read, and none of us likes to dwell on wasted efforts that in hindsight seem a bit silly. Here the classical method serves as a convenient way to present the final result.

The biggest source of variation in results is the choice of control/modifier variables. All datasets presumably contain some general and some special information, where the latter depends on the circumstances prevailing when the data were compiled. The regression should control for these circumstances in order to reach the general result. Such ceteris paribus controls are not part of the theory, so many possible controls may be added. The ones chosen for publication often appear to be those delivering the “right” results by the priors of the researcher. The justification for their inclusion is often thin, and when two-stage regressions are used, the first-stage instruments often have an even thinner justification.

Thus, the classical method is rather malleable to the preferences and interests of researchers and sponsors. This means that some papers using the classical technique are not what they pretend to be, as pointed out already by Leamer (1983); see also Paldam (2018) for new references and theory. The fact that data mining is tempting suggests that it is often possible to reach smashing results, making the paper nice to read. This may be precisely why it is cited.

Many papers using the classical method throw in some exotic statistical techniques to demonstrate the robustness of the result and the ability of the researcher. This presumably helps to generate credibility.

4.4 Knowledge about classical papers reached from meta-studies

(m1) The range of the estimates is typically amazingly large, given the high t-ratios reported. This confirms that t-ratios are problematic, as claimed in Section 2.3.
(m2) Publication biases (exaggerations) are common, i.e., meta-analyses routinely reject the null hypothesis of no publication bias. My own crude rule of thumb is that the exaggeration is by a factor of two – the two meta–meta studies cited give some support to this rule (see the simulation sketch below).
(m3) The meta-average estimated from all studies normally converges, and for N > 30 it normally stabilizes to a well-defined value; see Doucouliagos et al. (2018).
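To see how a publication filter generates exaggeration of roughly this size, consider a minimal simulation sketch. The true effect, standard error, and 5% significance cutoff below are illustrative assumptions, not values taken from the meta-studies cited:

```python
import numpy as np

rng = np.random.default_rng(0)

true_effect = 0.10   # assumed true effect size
se = 0.10            # assumed standard error of each study's estimate
n_studies = 100_000

# Each study reports an unbiased estimate of the true effect...
estimates = rng.normal(true_effect, se, n_studies)

# ...but only estimates significant at the 5% level (|t| >= 1.96)
# survive the publication filter.
published = estimates[np.abs(estimates / se) >= 1.96]

print(f"mean of all estimates:       {estimates.mean():.3f}")                # ~0.10
print(f"mean of published estimates: {published.mean():.3f}")                # ~0.24
print(f"exaggeration factor:         {published.mean() / true_effect:.1f}")  # ~2.4
```

Averaging only the significant estimates more than doubles the apparent effect, which is the mechanism behind (m2).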

Individual studies using the classical method often look better than they are, and they are thus more uncertain than they appear, but we may think of the value of convergence for large Ns (numbers of observations) as the truth. The exaggeration is largest at the beginning of a new literature and gradually becomes smaller. Thus, the classical method does generate truth once the effect searched for has been studied from many sides. The word research does mean that the search has to be repeated! It is highly risky to trust only a few papers.

Meta-analysis has produced other results as well: results in top journals do not stand out; it is necessary to look at many journals, since many papers on the same effect are needed; and little of the large variation between results is due to the choice of estimators.

A similar development should occur in experimental economics. Experiments fall into families: a large number cover prisoner’s dilemma games, but there are also many studies of dictator games, auction games, etc. Surveys summarizing what we have learned about these games seem highly needed. Assessed summaries of old experiments are common, notably in the introductions to papers reporting new ones. It should be possible to extract the knowledge reached by sets of related lab experiments in a quantitative way, by some sort of meta-technique, but this has barely started. The first pioneering meta-studies of lab experiments do find the usual wide variation of results from seemingly closely related experiments. [25] A recent large-scale replicability study by Camerer et al. (2018) finds that published experiments in the high-quality journals Nature and Science exaggerate by a factor of two, just like regression studies using the classical method.

5 Conclusion

The study presents evidence that over the last 20 years economic research has moved away from theory towards empirical work using the classical method.

From the eighties onward, there has been a steady stream of papers pointing out that the classical method suffers from excess flexibility. It does deliver relevant results, but they tend to be too good. [26] While we increasingly know the size of the problems of the classical method, systematic knowledge about the problems of the other methods is weaker. It is possible that those problems are smaller, but we do not know.

Therefore, it is clear that obtaining solid knowledge about the size of an important effect requires a great number of papers analyzing many aspects of the effect, plus a careful quantitative survey. It is a well-known principle in the harder sciences that results need repeated independent replication to be truly trustworthy. In economics, this is accepted only in principle.

The classical method of empirical research is gradually winning, and this is a fine development: It does give answers to important policy questions. These answers are highly variable and often exaggerated, but through the efforts of many competing researchers, solid knowledge will gradually emerge.

Home page: http://www.martin.paldam.dk

Acknowledgments

The paper has been presented at the 2018 MAER-Net Colloquium in Melbourne, the Kiel Aarhus workshop in 2018, and at the European Public Choice 2019 Meeting in Jerusalem. I am grateful for all comments, especially from Chris Doucouliagos, Eelke de Jong, and Bob Reed. In addition, I thank the referees for constructive advice.

Conflict of interest: Author states no conflict of interest.

Appendix: Two tables and some assessments of the size of the profession

The text needs some numbers to assess the representativeness of the results reached. These numbers need only be orders of magnitude. I use the standard three-level classification into A, B, and C of researchers, departments, and journals. The connections between the three categories are dynamic and rely on complex sorting mechanisms. In an international setting, it matters that researchers have preferences for countries, notably their own. The relation between the three categories has a stochastic element.

The World of Learning organization reports on 36,000 universities, colleges, and other institutes of tertiary education and research. Many of these institutions are mainly engaged in undergraduate teaching, and some are quite modest. If half of these institutions have a program in economics with a staff of at least five, the total stock of academic economists is about 100,000 (18,000 × 5 = 90,000, as a lower bound), of which most are at the C-level.

The A-level of about 500 tenured researchers working at the top ten universities (mainly) publishes in the top 10 journals, which carry fewer than 1,000 papers per year; [27] see Heckman and Moktan (2020). They (mainly) cite each other, but they greatly influence other researchers. [28] The B-level consists of about 15,000–20,000 researchers who work at 400–500 research universities with graduate programs and ambitions to publish. They (mainly) publish in the next level of about 150 journals. [29] In addition, at least another 1,000 institutions strive to move up in the hierarchy.

Table A1: The counts for each of the 10 journals, by year

Subgroup columns: (M1.1) Theory, (M1.2) Stat. theory, (M1.3) Surveys/meta, (M2.1) Experiments, (M2.2) Event studies, (M3.1) Descriptive, (M3.2) Classical empiric, (M3.3) Newer empiric. Each row gives the journal, the total number of papers, and the non-zero subgroup counts; zero cells are left blank, as in the source table.

1997
(J1) Can 68 47 2 10 8 1
(J2) Emp 33 11 5 1 7 3 6
(J3) EER 56 34 3 4 12 3
(J4) EJPE 42 29 2 5 6
(J5) JEBO 41 26 7 3 5
(J6) Inter 45 35 1 7 2
(J7) Macro 44 18 1 10 15
(J8) Kyklos 21 10 1 4 6
(J9) PuCh 83 40 7 1 1 8 26
(J10) SJE 31 26 1 4

2002
(J1) Can 43 27 1 5 7 3
(J2) Emp 36 1 14 1 4 7 9
(J3) EER 91 63 4 3 4 17
(J4) EJPE 40 27 2 2 9
(J5) JEBO 85 52 3 14 10 5 1
(J6) Inter 59 40 4 9 6
(J7) Macro 25 8 2 1 6 8
(J8) Kyklos 22 6 1 2 13
(J9) PuCh 87 39 2 1 14 31
(J10) SJE 30 18 2 10

2007
(J1) Can 55 26 4 6 17 2
(J2) Emp 48 4 8 3 23 10
(J3) EER 89 55 2 1 8 20 3
(J4) EJPE 68 36 2 9 20 1
(J5) JEBO 101 73 10 3 3 12
(J6) Inter 66 39 4 21 2
(J7) Macro 51 30 1 6 10 4
(J8) Kyklos 30 2 1 6 20 1
(J9) PuCh 114 53 4 19 38
(J10) SJE 39 29 1 1 2 6

2012
(J1) Can 66 33 1 1 1 8 21 1
(J2) Emp 104 8 16 17 38 25
(J3) EER 106 56 7 1 7 33 2
(J4) EJPE 47 12 1 2 31 1
(J5) JEBO 207 75 2 9 50 17 52 2
(J6) Inter 87 36 17 33 1
(J7) Macro 79 32 2 3 12 14 16
(J8) Kyklos 29 8 2 19
(J9) PuCh 99 47 2 2 48
(J10) SJE 57 32 2 1 22

2017
(J1) Can 46 20 1 5 9 9 2
(J2) Emp 139 1 25 4 30 60 19
(J3) EER 140 75 1 1 16 13 32 2
(J4) EJPE 49 14 2 1 4 27 1
(J5) JEBO 229 66 1 3 63 9 11 76
(J6) Inter 93 42 10 33 8
(J7) Macro 65 28 1 9 10 13 4
(J8) Kyklos 24 1 1 3 19
(J9) PuCh 67 33 1 3 10 20
(J10) SJE 39 19 1 1 1 4 12 1

Table A2: Counts, shares, and changes for all ten journals, by subgroup

Number (M1.1) (M1.2) (M1.3) (M2.1) (M2.2) (M3.1) (M3.2) (M3.3)
Year I: Sum of counts
1997 464 276 5 15 9 2 43 87 27
2002 518 281 19 11 21 0 45 114 27
2007 661 347 10 9 15 4 66 187 23
2012 881 339 21 13 62 3 106 289 48
2017 891 299 29 20 86 15 104 301 37
All years 3,415 1,542 84 68 193 24 364 978 162
Year II: Average fraction in per cent
1997 100 59.5 1.1 3.2 1.9 0.4 9.3 18.8 5.8
2002 100 54.2 3.7 2.1 4.1 0.0 8.7 22.0 5.2
2007 100 52.5 1.5 1.4 2.3 0.6 10.0 28.3 3.5
2012 100 38.5 2.4 1.5 7.0 0.3 12.0 32.8 5.4
2017 100 33.6 3.3 2.2 9.7 1.7 11.7 33.8 4.2
All years 100 45.2 2.5 2.0 5.7 0.7 10.7 28.6 4.7
Trend-scores [0, 10, 0] [7, 3, 0] [4, 6, 0] [9, 1, 0] [5, 5, 0] [8, 2, 0] [10, 0, 0] [3, 7, 0]
Binomial test 34 37 100 11 34
From To III: Change of fraction in percentage points
1997 2002 −5.2 2.6 −1.1 2.1 −0.4 −0.6 3.3 −0.6
2002 2007 −1.8 −2.2 −0.8 −1.8 0.6 1.3 6.3 −1.7
2007 2012 −14.0 0.9 0.1 4.8 −0.3 2.0 4.5 2.0
2012 2017 −4.9 0.9 0.8 2.6 1.3 −0.4 1.0 −1.3
1997 2017 −25.9 2.2 −1.0 7.7 1.3 2.4 15.0 −1.7

Note: The trend-scores are calculated as in Table 6. The results are similar to those in Table 6, but with less power. Note, however, that the results in column (M2.1), dealing with experiments, are stronger in Table A2. This has to do with the way missing observations are treated in the test.

References

Angrist, J., Azoulay, P., Ellison, G., Hill, R., & Lu, S. F. (2017). Economic research evolves: Fields and styles. American Economic Review (Papers & Proceedings), 107, 293–297. 10.1257/aer.p20171117

Bergh, A., & Wichardt, P. C. (2018). Mine, ours or yours? Unintended framing effects in dictator games (IFN Working Paper No. 1205). Stockholm: Research Institute of Industrial Economics; München: CESifo. 10.2139/ssrn.3208589

Brodeur, A., Cook, N., & Heyes, A. (2020). Methods matter: p-Hacking and publication bias in causal analysis in economics. American Economic Review, 110(11), 3634–3660. 10.1257/aer.20190687

Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., … Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2, 637–644. https://www.nature.com/articles/s41562-018-0399-z

Card, D., & DellaVigna, S. (2013). Nine facts about top journals in economics. Journal of Economic Literature, 51, 144–161. 10.3386/w18665

Christensen, G., & Miguel, E. (2018). Transparency, reproducibility, and the credibility of economics research. Journal of Economic Literature, 56, 920–980. 10.3386/w22989

Doucouliagos, H., Paldam, M., & Stanley, T. D. (2018). Skating on thin evidence: Implications for public policy. European Journal of Political Economy, 54, 16–25. 10.1016/j.ejpoleco.2018.03.004

Engel, C. (2011). Dictator games: A meta study. Experimental Economics, 14, 583–610. 10.1007/s10683-011-9283-7

Fiala, L., & Suetens, S. (2017). Transparency and cooperation in repeated dilemma games: A meta study. Experimental Economics, 20, 755–771. 10.1007/s10683-017-9517-4

Friedman, M. (1953). Essays in positive economics. Chicago: University of Chicago Press.

Hamermesh, D. (2013). Six decades of top economics publishing: Who and how? Journal of Economic Literature, 51, 162–172. 10.3386/w18635

Heckman, J. J., & Moktan, S. (2020). Publishing and promotion in economics: The tyranny of the top five. Journal of Economic Literature, 58, 419–470. 10.3386/w25093

Ioannidis, J. P. A., Stanley, T. D., & Doucouliagos, H. (2017). The power of bias in economics research. Economic Journal, 127, F236–F265. 10.1111/ecoj.12461

Johansen, S., & Juselius, K. (1990). Maximum likelihood estimation and inference on cointegration – with application to the demand for money. Oxford Bulletin of Economics and Statistics, 52, 169–210. 10.1111/j.1468-0084.1990.mp52002003.x

Justman, M. (2018). Randomized controlled trials informing public policy: Lessons from Project STAR and class size reduction. European Journal of Political Economy, 54, 167–174. 10.1016/j.ejpoleco.2018.04.005

Kydland, F., & Prescott, E. C. (1991). The econometrics of the general equilibrium approach to business cycles. Scandinavian Journal of Economics, 93, 161–178. 10.2307/3440324

Leamer, E. E. (1983). Let’s take the con out of econometrics. American Economic Review, 73, 31–43.

Levitt, S. D., & List, J. A. (2007). On the generalizability of lab behaviour to the field. Canadian Journal of Economics, 40, 347–370. 10.1111/j.1540-5982.2007.00412.x

Paldam, M. (2015). Meta-analysis in a nutshell: Techniques and general findings. Economics: The Open-Access, Open-Assessment E-Journal, 9, 1–4. 10.5018/economics-ejournal.ja.2015-11

Paldam, M. (2016). Simulating an empirical paper by the rational economist. Empirical Economics, 50, 1383–1407. 10.1007/s00181-015-0971-6

Paldam, M. (2018). A model of the representative economist, as researcher and policy advisor. European Journal of Political Economy, 54, 6–15. 10.1016/j.ejpoleco.2018.03.005

Smith, V. (1962). An experimental study of competitive market behavior. Journal of Political Economy, 70, 111–137.

Stanley, T. D., & Doucouliagos, H. (2012). Meta-regression analysis in economics and business. Abingdon: Routledge. 10.4324/9780203111710

Temple, C. L. (1918). Native races and their rulers; sketches and studies of official life and administrative problems in Nigeria. Cape Town: Argus.

© 2021 Martin Paldam, published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.


An empirical turn in economics research

Research Highlights Featured Chart | June 26, 2017 | Gian Romagnoli

[Image: A table of results in an issue of the American Economic Review.]

Over the past few decades, economists have increasingly been cited in the press and sought by Congress to give testimony on the issues of the day. This could be due in part to the increasingly empirical nature of economics research.

Aided by internet connections that allow datasets to be assembled from disparate sources and cheap computing power to crunch the numbers, economists are more and more often turning to real-world data to complement and test theoretical models.

This trend was documented in a 2013 article from the Journal of Economic Literature that showed, in a sample of 748 academic journal articles in top economics journals, that empirical work has become much more common since the 1960s.

In the spirit of empirical inquiry, the authors of a study appearing in the May issue of the American Economic Review: Papers & Proceedings used machine learning techniques to expand this analysis to a much larger set of 135,000 papers published across 80 academic journals cited frequently in the American Economic Review.


Figure 4  from Angrist et al. (2017)

Sorting hundreds of thousands of papers into “theoretical” and “empirical” piles by hand would be prohibitive, so authors Joshua Angrist, Pierre Azoulay, Glenn Ellison, Ryan Hill, and Susan Feng Lu use latent Dirichlet allocation and logistic ridge regression to analyze the wording of titles and abstracts and assign each paper to a category.

Based on a smaller group of five thousand papers classified by research assistants, the algorithm learns which keywords are associated with empirical work and theoretical work, and can then quickly classify thousands of other papers that weren’t reviewed directly by the researchers.
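The authors’ actual pipeline combines latent Dirichlet allocation with logistic ridge regression; the sketch below is a much-simplified, hypothetical stand-in for the supervised step only, using scikit-learn’s logistic regression with an L2 (ridge) penalty on TF-IDF word features and made-up example abstracts:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set standing in for the ~5,000 hand-labeled abstracts.
abstracts = [
    "We estimate the effect of class size on test scores using panel data",
    "We prove existence of equilibrium in a dynamic game with incomplete information",
    "Using regression discontinuity, we measure the causal impact of the reform",
    "A theoretical model of search frictions yields comparative statics results",
]
labels = ["empirical", "theoretical", "empirical", "theoretical"]

# Logistic regression with an L2 (ridge) penalty on TF-IDF word features.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression(penalty="l2", C=1.0))
clf.fit(abstracts, labels)

# Classify an unseen paper from its abstract.
new = ["We test the model's predictions on administrative tax records"]
print(clf.predict(new))  # e.g., ['empirical']
```

With thousands of labeled examples rather than four, the fitted coefficients reveal which words signal empirical versus theoretical work, which is how such a model “learns keywords.”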

The figure above shows that the prevalence of empirical work, as determined by the authors’ model, has been rising across fields since 1980. The authors note that the empirical turn is not a result of certain more empirical fields overtaking other more theoretical ones, but instead of every field becoming more empirically minded.

What is Empirical Research? Definition, Methods, Examples

Appinio Research · 09.02.2024 · 36min read


Ever wondered how we gather the facts, unveil hidden truths, and make informed decisions in a world filled with questions? Empirical research holds the key.

In this guide, we'll delve deep into the art and science of empirical research, unraveling its methods, mysteries, and manifold applications. From defining the core principles to mastering data analysis and reporting findings, we're here to equip you with the knowledge and tools to navigate the empirical landscape.

What is Empirical Research?

Empirical research is the cornerstone of scientific inquiry, providing a systematic and structured approach to investigating the world around us. It is the process of gathering and analyzing empirical or observable data to test hypotheses, answer research questions, or gain insights into various phenomena. This form of research relies on evidence derived from direct observation or experimentation, allowing researchers to draw conclusions based on real-world data rather than purely theoretical or speculative reasoning.

Characteristics of Empirical Research

Empirical research is characterized by several key features:

  • Observation and Measurement : It involves the systematic observation or measurement of variables, events, or behaviors.
  • Data Collection : Researchers collect data through various methods, such as surveys, experiments, observations, or interviews.
  • Testable Hypotheses : Empirical research often starts with testable hypotheses that are evaluated using collected data.
  • Quantitative or Qualitative Data : Data can be quantitative (numerical) or qualitative (non-numerical), depending on the research design.
  • Statistical Analysis : Quantitative data often undergo statistical analysis to determine patterns, relationships, or significance.
  • Objectivity and Replicability : Empirical research strives for objectivity, minimizing researcher bias. It should be replicable, allowing other researchers to conduct the same study to verify results.
  • Conclusions and Generalizations : Empirical research generates findings based on data and aims to make generalizations about larger populations or phenomena.

Importance of Empirical Research

Empirical research plays a pivotal role in advancing knowledge across various disciplines. Its importance extends to academia, industry, and society as a whole. Here are several reasons why empirical research is essential:

  • Evidence-Based Knowledge : Empirical research provides a solid foundation of evidence-based knowledge. It enables us to test hypotheses, confirm or refute theories, and build a robust understanding of the world.
  • Scientific Progress : In the scientific community, empirical research fuels progress by expanding the boundaries of existing knowledge. It contributes to the development of theories and the formulation of new research questions.
  • Problem Solving : Empirical research is instrumental in addressing real-world problems and challenges. It offers insights and data-driven solutions to complex issues in fields like healthcare, economics, and environmental science.
  • Informed Decision-Making : In policymaking, business, and healthcare, empirical research informs decision-makers by providing data-driven insights. It guides strategies, investments, and policies for optimal outcomes.
  • Quality Assurance : Empirical research is essential for quality assurance and validation in various industries, including pharmaceuticals, manufacturing, and technology. It ensures that products and processes meet established standards.
  • Continuous Improvement : Businesses and organizations use empirical research to evaluate performance, customer satisfaction, and product effectiveness. This data-driven approach fosters continuous improvement and innovation.
  • Human Advancement : Empirical research in fields like medicine and psychology contributes to the betterment of human health and well-being. It leads to medical breakthroughs, improved therapies, and enhanced psychological interventions.
  • Critical Thinking and Problem Solving : Engaging in empirical research fosters critical thinking skills, problem-solving abilities, and a deep appreciation for evidence-based decision-making.

Empirical research empowers us to explore, understand, and improve the world around us. It forms the bedrock of scientific inquiry and drives progress in countless domains, shaping our understanding of both the natural and social sciences.

How to Conduct Empirical Research?

So, you've decided to dive into the world of empirical research. Let's begin by exploring the crucial steps involved in getting started with your research project.

1. Select a Research Topic

Selecting the right research topic is the cornerstone of a successful empirical study. It's essential to choose a topic that not only piques your interest but also aligns with your research goals and objectives. Here's how to go about it:

  • Identify Your Interests : Start by reflecting on your passions and interests. What topics fascinate you the most? Your enthusiasm will be your driving force throughout the research process.
  • Brainstorm Ideas : Engage in brainstorming sessions to generate potential research topics. Consider the questions you've always wanted to answer or the issues that intrigue you.
  • Relevance and Significance : Assess the relevance and significance of your chosen topic. Does it contribute to existing knowledge? Is it a pressing issue in your field of study or the broader community?
  • Feasibility : Evaluate the feasibility of your research topic. Do you have access to the necessary resources, data, and participants (if applicable)?

2. Formulate Research Questions

Once you've narrowed down your research topic, the next step is to formulate clear and precise research questions. These questions will guide your entire research process and shape your study's direction. To create effective research questions:

  • Specificity : Ensure that your research questions are specific and focused. Vague or overly broad questions can lead to inconclusive results.
  • Relevance : Your research questions should directly relate to your chosen topic. They should address gaps in knowledge or contribute to solving a particular problem.
  • Testability : Ensure that your questions are testable through empirical methods. You should be able to gather data and analyze it to answer these questions.
  • Avoid Bias : Craft your questions in a way that avoids leading or biased language. Maintain neutrality to uphold the integrity of your research.

3. Review Existing Literature

Before you embark on your empirical research journey, it's essential to immerse yourself in the existing body of literature related to your chosen topic. This step, often referred to as a literature review, serves several purposes:

  • Contextualization : Understand the historical context and current state of research in your field. What have previous studies found, and what questions remain unanswered?
  • Identifying Gaps : Identify gaps or areas where existing research falls short. These gaps will help you formulate meaningful research questions and hypotheses.
  • Theory Development : If your study is theoretical, consider how existing theories apply to your topic. If it's empirical, understand how previous studies have approached data collection and analysis.
  • Methodological Insights : Learn from the methodologies employed in previous research. What methods were successful, and what challenges did researchers face?

4. Define Variables

Variables are fundamental components of empirical research. They are the factors or characteristics that can change or be manipulated during your study. Properly defining and categorizing variables is crucial for the clarity and validity of your research. Here's what you need to know:

  • Independent Variables : These are the variables that you, as the researcher, manipulate or control. They are the "cause" in cause-and-effect relationships.
  • Dependent Variables : Dependent variables are the outcomes or responses that you measure or observe. They are the "effect" influenced by changes in independent variables.
  • Operational Definitions : To ensure consistency and clarity, provide operational definitions for your variables. Specify how you will measure or manipulate each variable.
  • Control Variables : In some studies, controlling for other variables that may influence your dependent variable is essential. These are known as control variables.

Understanding these foundational aspects of empirical research will set a solid foundation for the rest of your journey. Now that you've grasped the essentials of getting started, let's delve deeper into the intricacies of research design.

Empirical Research Design

Now that you've selected your research topic, formulated research questions, and defined your variables, it's time to delve into the heart of your empirical research journey – research design. This pivotal step determines how you will collect data and what methods you'll employ to answer your research questions. Let's explore the various facets of research design in detail.

Types of Empirical Research

Empirical research can take on several forms, each with its own unique approach and methodologies. Understanding the different types of empirical research will help you choose the most suitable design for your study. Here are some common types:

  • Experimental Research : In this type, researchers manipulate one or more independent variables to observe their impact on dependent variables. It's highly controlled and often conducted in a laboratory setting.
  • Observational Research : Observational research involves the systematic observation of subjects or phenomena without intervention. Researchers are passive observers, documenting behaviors, events, or patterns.
  • Survey Research : Surveys are used to collect data through structured questionnaires or interviews. This method is efficient for gathering information from a large number of participants.
  • Case Study Research : Case studies focus on in-depth exploration of one or a few cases. Researchers gather detailed information through various sources such as interviews, documents, and observations.
  • Qualitative Research : Qualitative research aims to understand behaviors, experiences, and opinions in depth. It often involves open-ended questions, interviews, and thematic analysis.
  • Quantitative Research : Quantitative research collects numerical data and relies on statistical analysis to draw conclusions. It involves structured questionnaires, experiments, and surveys.

Your choice of research type should align with your research questions and objectives. Experimental research, for example, is ideal for testing cause-and-effect relationships, while qualitative research is more suitable for exploring complex phenomena.

Experimental Design

Experimental research is a systematic approach to studying causal relationships. It's characterized by the manipulation of one or more independent variables while controlling for other factors. Here are some key aspects of experimental design:

  • Control and Experimental Groups : Participants are randomly assigned to either a control group or an experimental group. The independent variable is manipulated for the experimental group but not for the control group.
  • Randomization : Randomization is crucial to eliminate bias in group assignment. It ensures that each participant has an equal chance of being in either group.
  • Hypothesis Testing : Experimental research often involves hypothesis testing. Researchers formulate hypotheses about the expected effects of the independent variable and use statistical analysis to test these hypotheses.
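For instance, the random-assignment step can be done in a few lines; the participant IDs and group sizes below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

participants = [f"P{i:02d}" for i in range(1, 21)]  # 20 hypothetical participants

# Shuffle, then split half/half: every participant has an equal
# chance of landing in either group.
shuffled = rng.permutation(participants)
control, treatment = shuffled[:10], shuffled[10:]

print("Control:  ", sorted(control))
print("Treatment:", sorted(treatment))
```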

Observational Design

Observational research entails careful and systematic observation of subjects or phenomena. It's advantageous when you want to understand natural behaviors or events. Key aspects of observational design include:

  • Participant Observation : Researchers immerse themselves in the environment they are studying. They become part of the group being observed, allowing for a deep understanding of behaviors.
  • Non-Participant Observation : In non-participant observation, researchers remain separate from the subjects. They observe and document behaviors without direct involvement.
  • Data Collection Methods : Observational research can involve various data collection methods, such as field notes, video recordings, photographs, or coding of observed behaviors.

Survey Design

Surveys are a popular choice for collecting data from a large number of participants. Effective survey design is essential to ensure the validity and reliability of your data. Consider the following:

  • Questionnaire Design : Create clear and concise questions that are easy for participants to understand. Avoid leading or biased questions.
  • Sampling Methods : Decide on the appropriate sampling method for your study, whether it's random, stratified, or convenience sampling.
  • Data Collection Tools : Choose the right tools for data collection, whether it's paper surveys, online questionnaires, or face-to-face interviews.

Case Study Design

Case studies are an in-depth exploration of one or a few cases to gain a deep understanding of a particular phenomenon. Key aspects of case study design include:

  • Single Case vs. Multiple Case Studies : Decide whether you'll focus on a single case or multiple cases. Single case studies are intensive and allow for detailed examination, while multiple case studies provide comparative insights.
  • Data Collection Methods : Gather data through interviews, observations, document analysis, or a combination of these methods.

Qualitative vs. Quantitative Research

In empirical research, you'll often encounter the distinction between qualitative and quantitative research. Here's a closer look at these two approaches:

  • Qualitative Research : Qualitative research seeks an in-depth understanding of human behavior, experiences, and perspectives. It involves open-ended questions, interviews, and the analysis of textual or narrative data. Qualitative research is exploratory and often used when the research question is complex and requires a nuanced understanding.
  • Quantitative Research : Quantitative research collects numerical data and employs statistical analysis to draw conclusions. It involves structured questionnaires, experiments, and surveys. Quantitative research is ideal for testing hypotheses and establishing cause-and-effect relationships.

Understanding the various research design options is crucial in determining the most appropriate approach for your study. Your choice should align with your research questions, objectives, and the nature of the phenomenon you're investigating.

Data Collection for Empirical Research

Now that you've established your research design, it's time to roll up your sleeves and collect the data that will fuel your empirical research. Effective data collection is essential for obtaining accurate and reliable results.

Sampling Methods

Sampling methods are critical in empirical research, as they determine the subset of individuals or elements from your target population that you will study. Here are some standard sampling methods:

  • Random Sampling : Random sampling ensures that every member of the population has an equal chance of being selected. It minimizes bias and is often used in quantitative research.
  • Stratified Sampling : Stratified sampling involves dividing the population into subgroups or strata based on specific characteristics (e.g., age, gender, location). Samples are then randomly selected from each stratum, ensuring representation of all subgroups.
  • Convenience Sampling : Convenience sampling involves selecting participants who are readily available or easily accessible. While it's convenient, it may introduce bias and limit the generalizability of results.
  • Snowball Sampling : Snowball sampling is instrumental when studying hard-to-reach or hidden populations. One participant leads you to another, creating a "snowball" effect. This method is common in qualitative research.
  • Purposive Sampling : In purposive sampling, researchers deliberately select participants who meet specific criteria relevant to their research questions. It's often used in qualitative studies to gather in-depth information.

The choice of sampling method depends on the nature of your research, available resources, and the degree of precision required. It's crucial to carefully consider your sampling strategy to ensure that your sample accurately represents your target population.
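To make the two most common strategies concrete, here is a minimal sketch using only Python's standard library; the population, strata, and sample size are invented for illustration:

```python
import random

random.seed(7)

# Hypothetical sampling frame: (person_id, stratum) pairs.
population = [(i, "urban" if i % 3 else "rural") for i in range(1, 301)]

# Simple random sample: every member has the same chance of selection.
srs = random.sample(population, k=30)

# Stratified sample: draw proportionally within each stratum.
strata = {}
for person in population:
    strata.setdefault(person[1], []).append(person)
stratified = [p for members in strata.values()
              for p in random.sample(members, k=len(members) * 30 // len(population))]

print(len(srs), "simple random;", len(stratified), "stratified")  # 30 and 30
```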

Data Collection Instruments

Data collection instruments are the tools you use to gather information from your participants or sources. These instruments should be designed to capture the data you need accurately. Here are some popular data collection instruments:

  • Questionnaires : Questionnaires consist of structured questions with predefined response options. When designing questionnaires, consider the clarity of questions, the order of questions, and the response format (e.g., Likert scale, multiple-choice).
  • Interviews : Interviews involve direct communication between the researcher and participants. They can be structured (with predetermined questions) or unstructured (open-ended). Effective interviews require active listening and probing for deeper insights.
  • Observations : Observations entail systematically and objectively recording behaviors, events, or phenomena. Researchers must establish clear criteria for what to observe, how to record observations, and when to observe.
  • Surveys : Surveys are a common data collection instrument for quantitative research. They can be administered through various means, including online surveys, paper surveys, and telephone surveys.
  • Documents and Archives : In some cases, data may be collected from existing documents, records, or archives. Ensure that the sources are reliable, relevant, and properly documented.


Data Collection Procedures

Data collection procedures outline the step-by-step process for gathering data. These procedures should be meticulously planned and executed to maintain the integrity of your research.

  • Training : If you have a research team, ensure that they are trained in data collection methods and protocols. Consistency in data collection is crucial.
  • Pilot Testing : Before launching your data collection, conduct a pilot test with a small group to identify any potential problems with your instruments or procedures. Make necessary adjustments based on feedback.
  • Data Recording : Establish a systematic method for recording data. This may include timestamps, codes, or identifiers for each data point.
  • Data Security : Safeguard the confidentiality and security of collected data. Ensure that only authorized individuals have access to the data.
  • Data Storage : Properly organize and store your data in a secure location, whether in physical or digital form. Back up data to prevent loss.

Ethical Considerations

Ethical considerations are paramount in empirical research, as they ensure the well-being and rights of participants are protected.

  • Informed Consent : Obtain informed consent from participants, providing clear information about the research purpose, procedures, risks, and their right to withdraw at any time.
  • Privacy and Confidentiality : Protect the privacy and confidentiality of participants. Ensure that data is anonymized and sensitive information is kept confidential.
  • Beneficence : Ensure that your research benefits participants and society while minimizing harm. Consider the potential risks and benefits of your study.
  • Honesty and Integrity : Conduct research with honesty and integrity. Report findings accurately and transparently, even if they are not what you expected.
  • Respect for Participants : Treat participants with respect, dignity, and sensitivity to cultural differences. Avoid any form of coercion or manipulation.
  • Institutional Review Board (IRB) : If required, seek approval from an IRB or ethics committee before conducting your research, particularly when working with human participants.

Adhering to ethical guidelines is not only essential for the ethical conduct of research but also crucial for the credibility and validity of your study. Ethical research practices build trust between researchers and participants and contribute to the advancement of knowledge with integrity.

With a solid understanding of data collection, including sampling methods, instruments, procedures, and ethical considerations, you are now well-equipped to gather the data needed to answer your research questions.

Empirical Research Data Analysis

Now comes the exciting phase of data analysis, where the raw data you've diligently collected starts to yield insights and answers to your research questions. We will explore the various aspects of data analysis, from preparing your data to drawing meaningful conclusions through statistics and visualization.

Data Preparation

Data preparation is the crucial first step in data analysis. It involves cleaning, organizing, and transforming your raw data into a format that is ready for analysis. Effective data preparation ensures the accuracy and reliability of your results.

  • Data Cleaning : Identify and rectify errors, missing values, and inconsistencies in your dataset. This may involve correcting typos, removing outliers, and imputing missing data.
  • Data Coding : Assign numerical values or codes to categorical variables to make them suitable for statistical analysis. For example, converting "Yes" and "No" to 1 and 0.
  • Data Transformation : Transform variables as needed to meet the assumptions of the statistical tests you plan to use. Common transformations include logarithmic or square root transformations.
  • Data Integration : If your data comes from multiple sources, integrate it into a unified dataset, ensuring that variables match and align.
  • Data Documentation : Maintain clear documentation of all data preparation steps, as well as the rationale behind each decision. This transparency is essential for replicability.

Effective data preparation lays the foundation for accurate and meaningful analysis. It allows you to trust the results that will follow in the subsequent stages.
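As a concrete illustration of the cleaning, coding, and transformation steps above, here is a minimal pandas sketch; the dataset and the age cutoff are invented for illustration:

```python
import numpy as np
import pandas as pd

# Hypothetical raw survey data with typical problems.
raw = pd.DataFrame({
    "age":    [25, 32, np.nan, 41, 250],   # one missing value, one impossible outlier
    "smoker": ["Yes", "No", "No", "yes", "No"],
    "income": [30_000, 45_000, 52_000, 61_000, 38_000],
})

df = raw.copy()
df.loc[df["age"] > 120, "age"] = np.nan            # treat impossible ages as missing
df["age"] = df["age"].fillna(df["age"].median())   # impute missing ages with the median
df["smoker"] = df["smoker"].str.lower().map({"yes": 1, "no": 0})  # code Yes/No as 1/0
df["log_income"] = np.log(df["income"])            # log-transform a skewed variable

print(df)
```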

Descriptive Statistics

Descriptive statistics help you summarize and make sense of your data by providing a clear overview of its key characteristics. These statistics are essential for understanding the central tendencies, variability, and distribution of your variables. Descriptive statistics include:

  • Measures of Central Tendency : These include the mean (average), median (middle value), and mode (most frequent value). They help you understand the typical or central value of your data.
  • Measures of Dispersion : Measures like the range, variance, and standard deviation provide insights into the spread or variability of your data points.
  • Frequency Distributions : Creating frequency distributions or histograms allows you to visualize the distribution of your data across different values or categories.

Descriptive statistics provide the initial insights needed to understand your data's basic characteristics, which can inform further analysis.
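For example, the measures above can be computed with Python's standard statistics module; the test scores are made up:

```python
import statistics as st

scores = [72, 85, 85, 90, 64, 78, 85, 69, 91, 77]  # hypothetical test scores

print("mean:    ", st.mean(scores))                 # central tendency
print("median:  ", st.median(scores))
print("mode:    ", st.mode(scores))
print("range:   ", max(scores) - min(scores))       # dispersion
print("stdev:   ", round(st.stdev(scores), 2))      # sample standard deviation
print("variance:", round(st.variance(scores), 2))
```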

Inferential Statistics

Inferential statistics take your analysis to the next level by allowing you to make inferences or predictions about a larger population based on your sample data. These methods help you test hypotheses and draw meaningful conclusions. Key concepts in inferential statistics include:

  • Hypothesis Testing : Hypothesis tests (e.g., t-tests , chi-squared tests ) help you determine whether observed differences or associations in your data are statistically significant or occurred by chance.
  • Confidence Intervals : Confidence intervals provide a range within which population parameters (e.g., population mean) are likely to fall based on your sample data.
  • Regression Analysis : Regression models (linear, logistic, etc.) help you explore relationships between variables and make predictions.
  • Analysis of Variance (ANOVA) : ANOVA tests are used to compare means between multiple groups, allowing you to assess whether differences are statistically significant.

Inferential statistics are powerful tools for drawing conclusions from your data and assessing the generalizability of your findings to the broader population.
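The sketch below runs each of these tests on invented data with SciPy; the group means, sizes, and contingency table are illustrative assumptions only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical scores for three groups (say, three teaching methods).
a = rng.normal(70, 10, 50)
b = rng.normal(75, 10, 50)
c = rng.normal(72, 10, 50)

# Two-sample t-test: do groups a and b differ in mean?
t, p = stats.ttest_ind(a, b)
print(f"t-test: t = {t:.2f}, p = {p:.4f}")

# One-way ANOVA: do the three group means differ?
f, p = stats.f_oneway(a, b, c)
print(f"ANOVA:  F = {f:.2f}, p = {p:.4f}")

# Chi-squared test of independence on a made-up 2x2 contingency table.
table = np.array([[30, 20], [18, 32]])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi2:   chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")

# 95% confidence interval for the mean of group a.
lo, hi = stats.t.interval(0.95, df=len(a) - 1, loc=a.mean(), scale=stats.sem(a))
print(f"95% CI for mean(a): ({lo:.1f}, {hi:.1f})")
```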

Qualitative Data Analysis

Qualitative data analysis is employed when working with non-numerical data, such as text, interviews, or open-ended survey responses. It focuses on understanding the underlying themes, patterns, and meanings within qualitative data. Qualitative analysis techniques include:

  • Thematic Analysis : Identifying and analyzing recurring themes or patterns within textual data.
  • Content Analysis : Categorizing and coding qualitative data to extract meaningful insights.
  • Grounded Theory : Developing theories or frameworks based on emergent themes from the data.
  • Narrative Analysis : Examining the structure and content of narratives to uncover meaning.

Qualitative data analysis provides a rich and nuanced understanding of complex phenomena and human experiences.
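As a minimal illustration of the coding step shared by thematic and content analysis, the sketch below counts how often hand-defined themes appear in invented survey responses; real qualitative coding is iterative and interpretive, not a simple keyword match:

```python
from collections import Counter

# Hypothetical open-ended survey responses.
responses = [
    "The onboarding was confusing but support staff were helpful",
    "Great support team, although the price feels too high",
    "Price is fair, onboarding documentation could be clearer",
]

# A hand-built codebook mapping themes to indicator keywords.
codebook = {
    "onboarding": ["onboarding", "documentation"],
    "support":    ["support", "staff"],
    "pricing":    ["price"],
}

counts = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in codebook.items():
        if any(k in lowered for k in keywords):
            counts[theme] += 1

print(counts)  # Counter({'onboarding': 2, 'support': 2, 'pricing': 2})
```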

Data Visualization

Data visualization is the art of representing data graphically to make complex information more understandable and accessible. Effective data visualization can reveal patterns, trends, and outliers in your data. Common types of data visualization include:

  • Bar Charts and Histograms : Used to display the distribution of categorical data or discrete data .
  • Line Charts : Ideal for showing trends and changes in data over time.
  • Scatter Plots : Visualize relationships and correlations between two variables.
  • Pie Charts : Display the composition of a whole in terms of its parts.
  • Heatmaps : Depict patterns and relationships in multidimensional data through color-coding.
  • Box Plots : Provide a summary of the data distribution, including outliers.
  • Interactive Dashboards : Create dynamic visualizations that allow users to explore data interactively.

Data visualization not only enhances your understanding of the data but also serves as a powerful communication tool to convey your findings to others.
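A minimal matplotlib sketch of two of these chart types, drawn on randomly generated (purely illustrative) data:

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(50, 15, 200)            # hypothetical variable, e.g. ages
y = 0.8 * x + rng.normal(0, 10, 200)   # a second, correlated variable

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

ax1.hist(x, bins=20, edgecolor="black")  # histogram: distribution of x
ax1.set_title("Histogram of x")

ax2.scatter(x, y, alpha=0.5)             # scatter plot: relationship of x and y
ax2.set_title("Scatter plot of x vs. y")

fig.tight_layout()
plt.show()
```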

As you embark on the data analysis phase of your empirical research, remember that the specific methods and techniques you choose will depend on your research questions, data type, and objectives. Effective data analysis transforms raw data into valuable insights, bringing you closer to the answers you seek.

How to Report Empirical Research Results?

At this stage, you get to share your empirical research findings with the world. Effective reporting and presentation of your results are crucial for communicating your research's impact and insights.

1. Write the Research Paper

Writing a research paper is the culmination of your empirical research journey. It's where you synthesize your findings, provide context, and contribute to the body of knowledge in your field.

  • Title and Abstract : Craft a clear and concise title that reflects your research's essence. The abstract should provide a brief summary of your research objectives, methods, findings, and implications.
  • Introduction : In the introduction, introduce your research topic, state your research questions or hypotheses, and explain the significance of your study. Provide context by discussing relevant literature.
  • Methods : Describe your research design, data collection methods, and sampling procedures. Be precise and transparent, allowing readers to understand how you conducted your study.
  • Results : Present your findings in a clear and organized manner. Use tables, graphs, and statistical analyses to support your results. Avoid interpreting your findings in this section; focus on the presentation of raw data.
  • Discussion : Interpret your findings and discuss their implications. Relate your results to your research questions and the existing literature. Address any limitations of your study and suggest avenues for future research.
  • Conclusion : Summarize the key points of your research and its significance. Restate your main findings and their implications.
  • References : Cite all sources used in your research following a specific citation style (e.g., APA, MLA, Chicago). Ensure accuracy and consistency in your citations.
  • Appendices : Include any supplementary material, such as questionnaires, data coding sheets, or additional analyses, in the appendices.

Writing a research paper is a skill that improves with practice. Ensure clarity, coherence, and conciseness in your writing to make your research accessible to a broader audience.

2. Create Visuals and Tables

Visuals and tables are powerful tools for presenting complex data in an accessible and understandable manner.

  • Clarity : Ensure that your visuals and tables are clear and easy to interpret. Use descriptive titles and labels.
  • Consistency : Maintain consistency in formatting, such as font size and style, across all visuals and tables.
  • Appropriateness : Choose the most suitable visual representation for your data. Bar charts, line graphs, and scatter plots work well for different types of data.
  • Simplicity : Avoid clutter and unnecessary details. Focus on conveying the main points.
  • Accessibility : Make sure your visuals and tables are accessible to a broad audience, including those with visual impairments.
  • Captions : Include informative captions that explain the significance of each visual or table.

Compelling visuals and tables enhance the reader's understanding of your research and can be the key to conveying complex information efficiently.

3. Interpret Findings

Interpreting your findings is where you bridge the gap between data and meaning. It's your opportunity to provide context, discuss implications, and offer insights. When interpreting your findings:

  • Relate to Research Questions : Discuss how your findings directly address your research questions or hypotheses.
  • Compare with Literature : Analyze how your results align with or deviate from previous research in your field. What insights can you draw from these comparisons?
  • Discuss Limitations : Be transparent about the limitations of your study. Address any constraints, biases, or potential sources of error.
  • Practical Implications : Explore the real-world implications of your findings. How can they be applied or inform decision-making?
  • Future Research Directions : Suggest areas for future research based on the gaps or unanswered questions that emerged from your study.

Interpreting findings goes beyond simply presenting data; it's about weaving a narrative that helps readers grasp the significance of your research in the broader context.

With your research paper written, structured, and enriched with visuals, and your findings expertly interpreted, you are now prepared to communicate your research effectively. Sharing your insights and contributing to the body of knowledge in your field is a significant accomplishment in empirical research.

Examples of Empirical Research

To solidify your understanding of empirical research, let's delve into some real-world examples across different fields. These examples will illustrate how empirical research is applied to gather data, analyze findings, and draw conclusions.

Social Sciences

In the realm of social sciences, consider a sociological study exploring the impact of socioeconomic status on educational attainment. Researchers gather data from a diverse group of individuals, including their family backgrounds, income levels, and academic achievements.

Through statistical analysis, they can identify correlations and trends, revealing whether individuals from lower socioeconomic backgrounds are less likely to attain higher levels of education. This empirical research helps shed light on societal inequalities and informs policymakers on potential interventions to address disparities in educational access.

Environmental Science

Environmental scientists often employ empirical research to assess the effects of environmental changes. For instance, researchers studying the impact of climate change on wildlife might collect data on animal populations, weather patterns, and habitat conditions over an extended period.

By analyzing this empirical data, they can identify correlations between climate fluctuations and changes in wildlife behavior, migration patterns, or population sizes. This empirical research is crucial for understanding the ecological consequences of climate change and informing conservation efforts.

Business and Economics

In the business world, empirical research is essential for making data-driven decisions. Consider a market research study conducted by a business seeking to launch a new product. They collect data through surveys, focus groups, and consumer behavior analysis.

By examining this empirical data, the company can gauge consumer preferences, demand, and potential market size. Empirical research in business helps guide product development, pricing strategies, and marketing campaigns, increasing the likelihood of a successful product launch.

Psychology

Psychological studies frequently rely on empirical research to understand human behavior and cognition. For instance, a psychologist interested in examining the impact of stress on memory might design an experiment. Participants are exposed to stress-inducing situations, and their memory performance is assessed through various tasks.

By analyzing the data collected, the psychologist can determine whether stress has a significant effect on memory recall. This empirical research contributes to our understanding of the complex interplay between psychological factors and cognitive processes.

These examples highlight the versatility and applicability of empirical research across diverse fields. Whether in medicine, social sciences, environmental science, business, or psychology, empirical research serves as a fundamental tool for gaining insights, testing hypotheses, and driving advancements in knowledge and practice.

Conclusion for Empirical Research

Empirical research is a powerful tool for gaining insights, testing hypotheses, and making informed decisions. By following the steps outlined in this guide, you've learned how to select research topics, collect data, analyze findings, and effectively communicate your research to the world. Remember, empirical research is a journey of discovery, and each step you take brings you closer to a deeper understanding of the world around you. Whether you're a scientist, a student, or someone curious about the process, the principles of empirical research empower you to explore, learn, and contribute to the ever-expanding realm of knowledge.

How to Collect Data for Empirical Research?

Introducing Appinio , the real-time market research platform revolutionizing how companies gather consumer insights for their empirical research endeavors. With Appinio, you can conduct your own market research in minutes, gaining valuable data to fuel your data-driven decisions.

Appinio is more than just a market research platform; it's a catalyst for transforming the way you approach empirical research, making it exciting, intuitive, and seamlessly integrated into your decision-making process.

Here's why Appinio is the go-to solution for empirical research:

  • From Questions to Insights in Minutes: With Appinio's streamlined process, you can go from formulating your research questions to obtaining actionable insights in a matter of minutes, saving you time and effort.
  • Intuitive Platform for Everyone: No need for a PhD in research; Appinio's platform is designed to be intuitive and user-friendly, ensuring that anyone can navigate and utilize it effectively.
  • Rapid Response Times: With an average field time of under 23 minutes for 1,000 respondents, Appinio delivers rapid results, allowing you to gather data swiftly and efficiently.
  • Global Reach with Targeted Precision: With access to over 90 countries and the ability to define target groups based on 1200+ characteristics, Appinio empowers you to reach your desired audience with precision and ease.



HYPOTHESIS AND THEORY article

Is economics an empirical science? If not, can it become one?

Sergio M. Focardi*

  • 1 Department of Applied Mathematics and Statistics, Stony Brook University, Stony Brook, NY, USA
  • 2 De Vinci Finance Lab, École Supérieure d'Ingénieurs Léonard de Vinci, Paris, France

Today's mainstream economics, embodied in Dynamic Stochastic General Equilibrium (DSGE) models, cannot be considered an empirical science in the modern sense of the term: it is not based on empirical data, is not descriptive of the real-world economy, and has little forecasting power. In this paper, I begin with a review of the weaknesses of neoclassical economic theory and argue for a truly scientific theory based on data, the sine qua non of bringing economics into the realm of an empirical science. But I suggest that, before embarking on this endeavor, we first need to analyze the epistemological problems of economics to understand what research questions we can reasonably ask our theory to address. I then discuss new approaches which hold the promise of bringing economics closer to being an empirical science. Among the approaches discussed are the study of economies as complex systems, econometrics and econophysics, artificial economics made up of multiple interacting agents, as well as attempts being made inside present mainstream theory to more closely align the theory with the real world.

Introduction

In this paper we analyze the status of economics as a science: Can neoclassical economics be considered an empirical science, even if only one in the making? Are there other approaches that might better bring economics into the realm of empirical science, with the objective of allowing us to forecast (at least probabilistically) economic and market phenomena?

While our discussion is centered on economic theory, our considerations can be extended to finance. Indeed, mainstream finance theory shares the basic framework of mainstream economic theory, which was developed in the 1960s and 1970s in what is called the "rational expectations revolution." The starting point was the so-called Lucas critique. University of Chicago professor Robert Lucas observed that changes in government policy are made ineffective by the fact that economic agents anticipate these changes and modify their behavior. He therefore advocated giving a micro foundation to macroeconomics, that is, explaining macroeconomics in terms of the behavior of individual agents. Lucas was awarded the 1995 Nobel Prize in Economics for his work.

The result of the Lucas critique was a tendency, among those working within the framework of mainstream economic theory, to develop macroeconomic models based on a multitude of agents characterized by rational expectations, optimization and equilibrium 1 . Following common practice we will refer to this economic theory as neoclassical economics or mainstream economics. Mainstream finance theory adopted the same basic principles as general equilibrium economics.

Since the 2007–2009 financial crisis and the ensuing economic downturn—neither foreseen (not even probabilistically) by neoclassical economic theory—the theory has come under increasing criticism. Many observe that mainstream economics provides little knowledge from which we can make reliable forecasts—the objective of science in the modern sense of the term (for more on this, see Fabozzi et al. [ 1 ]).

Before discussing these questions, it is useful to identify the appropriate epistemological framework(s) for economic theory. That is, we need to understand what questions we can ask our theory to address and what types of answers we might expect. If economics is to become an empirical science, we cannot accept terms such as volatility, inflation, growth, recession, consumer confidence, and so on without carefully defining them: the epistemology of economics has to be clarified.

We will subsequently discuss why we argue that neoclassical economic and finance theory is not an empirical science as presently formulated, nor can it become one. We will then discuss new ideas that offer the possibility of bringing economic and finance theory closer to being empirical sciences, in particular, economics (and finance) based on the analysis of financial time series (e.g., econometrics) and on the theory of complexity. These new ideas might be referred to collectively as "scientific economics."

We suggest that the epistemological framework of economics is not that of physics but that of complex systems. After all, economies are hierarchical complex systems made up of human agents—complex systems in themselves—and aggregations of human agents. We will argue that giving a micro foundation to macroeconomics is a project with intrinsic limitations, typical of complex systems. These limitations constrain the types of questions we can ask. Note that the notion of economies as complex systems is not really new. Adam Smith's notion of the “invisible hand” is an emerging property of complex markets. Even mainstream economics represents economic systems as complex systems made up of a large number of agents—but it makes unreasonable simplifying assumptions.

For this and other reasons, economics is a science very different from the science of physics. It might be that a true understanding of economics will require a new synthesis of the physical and social sciences given that economies are complex systems made up of human individuals and must therefore take into account human values, a feature that does not appear in other complex systems. This possibility will be explored in the fourth part of our discussion, but let's first start with a review of the epistemological foundations of modern sciences and how our economic theory eventually fits in.

The Epistemological Foundations of Economics

The hallmark of modern science is its empirical character: modern science is based on theories that model or explain empirical observations. No a priori factual knowledge is assumed; a priori knowledge is confined to logic and mathematics and, perhaps, some very general principles related to the meaning of scientific laws and terms and how they are linked to observations. There are philosophical and scientific issues associated with this principle. For example, in his Two Dogmas of Empiricism, van Orman Quine [ 2 ] challenged the separation between logical analytic truth and factual truth. We will herein limit our discussion to the key issues with a bearing on economics. They are:

a. What is the nature of observations?

b. How can empirical theories be validated or refuted?

c. What is the nature of our knowledge of complex systems?

d. What scientific knowledge can we have of mental processes and of systems that depend on human mental processes?

Let's now discuss each of these issues.

What is the Nature of Observations?

The physical sciences have adopted the principles of operationalism as put forward by Bridgman [ 3 ], recipient of the 1946 Nobel Prize in Physics, in his book The Logic of Modern Physics . Operationalism holds that the meaning of scientific concepts is rooted in the operations needed to measure physical quantities 2 . Operationalism rejects the idea that there are quantities defined a priori that we can measure with different (eventually approximate) methods. It argues that the meaning of a scientific concept is in how we observe (or measure) it.

Operationalism has been criticized on the basis that science, in particular physics, uses abstract terms such as "mass" or "force" that are not directly linked to a measurement process. See, for example, Hempel [ 4 ]. This criticism does not invalidate operationalism but requires that operationalism as an epistemological principle be interpreted globally. The meaning of a physical concept is not given by a single measurement process but by the entire theory and by the set of all observations. This point of view has been argued by many philosophers and scientists, including Feyerabend [ 5 ], Kuhn [ 6 ], and van Orman Quine [ 2 ].

But how do we define “observations”? In physics, where theories have been validated to a high degree of precision, we accept as observations quantities obtained through complex, theory-dependent measurement processes. For example, we observe temperature through the elongation of a column of mercury because the relationship between the length of the column of mercury and temperature is well-established and coherent with other observations such as the change of electrical resistance of a conductor. Temperature is an abstract term that enters in many indirect observations, all coherent.

Contrast the above to economic and finance theory, where there is a classical distinction between observables and hidden variables. The price of a stock is an observable, while the market state of low or high volatility is a hidden variable. There is a plethora of methods to measure volatility, including the ARCH/GARCH family of models, stochastic volatility models, and implied volatility. All these methods are conceptually different and yield different measurements. Volatility would be a well-defined concept if it were a theoretical term that is part of a global theory of economics or finance. But in economics and finance, the different models used to measure volatility embody different concepts of volatility.
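To make this concrete, here is a minimal sketch in Python, with simulated returns standing in for observed data, of two standard but conceptually different volatility measurements: a rolling-window realized volatility and a RiskMetrics-style exponentially weighted estimate. The window length and decay factor are conventional illustrative choices; the point is simply that two legitimate measurements of the "volatility" of the same series disagree.

```python
import numpy as np

# Placeholder: simulated daily returns stand in for an observed price series.
rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=1000) * 0.01

# Measurement 1: rolling-window realized volatility (21-day window, annualized).
window = 21
realized = np.array([returns[t - window:t].std() for t in range(window, len(returns))])
realized_ann = realized * np.sqrt(252)

# Measurement 2: RiskMetrics-style EWMA volatility (decay 0.94, annualized).
lam = 0.94
ewma_var = np.empty(len(returns))
ewma_var[0] = returns[0] ** 2
for t in range(1, len(returns)):
    ewma_var[t] = lam * ewma_var[t - 1] + (1 - lam) * returns[t] ** 2
ewma_ann = np.sqrt(ewma_var[window:] * 252)

# Two legitimate "volatilities" of the same series that do not agree.
print(realized_ann[-1], ewma_ann[-1])
```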

There is no global theory that effectively links all true observations to all hidden variables. Instead, we have many individual empirical statements with only local links through specific models. This is a significant difference with respect to physical theories; it weakens the empirical content of economic and finance theory.

Note that the epistemology of economics is not (presently) based on a unified theory with abstract terms and observations. It is, as mentioned, based on many individual facts. Critics remark that mainstream economics is a deductive theory, not based on facts. This is true, but what would be required is a deductive theory based on facts. Collections of individual facts, for example financial time series, have, as mentioned, weak empirical content.

How Can Empirical Theories Be Validated or Refuted?

Another fundamental issue goes back to the eighteenth century, when the philosopher-economist David Hume outlined the philosophical principles of Empiricism. Here is the issue: No finite series of observations can justify the statement of a general law valid in all places and at all times, that is, scientific laws cannot be validated in any conclusive way. The problem of the validation of empirical laws has been widely debated; the prevailing view today is that scientific laws must be considered hypotheses validated by past data but susceptible of being invalidated by new observations. That is, scientific laws are hypotheses that explain (or model) known data and observations but there is no guarantee that new observations will not refute these laws. The attention has therefore shifted from the problem of validation to the problem of rejection.

That scientific theories cannot be validated but only refuted is the key argument in Karl Popper's influential Conjectures and Refutations: The Growth of Scientific Knowledge . Popper [ 7 ] argued that scientific laws are conjectures that cannot be validated but can be refuted. Refutations, however, are not a straightforward matter: confronted with new empirical data, theories can, to some extent, be stretched and modified to accommodate the new data.

The issue of validation and refutation is particularly critical in economics given the paucity of data. Financial models are validated with a low level of precision in comparison to physical laws. Consider, for example, the distribution of returns. It is known that returns at time horizons from minutes to weeks are not normally distributed but have tails fatter than those of a normal distribution. However, the exact form of returns distributions is not known. Current models propose a range from inverse power laws with a variety of exponents to stretched exponentials, but there is no consensus.

Economic and financial models—all probabilistic models—are validated or refuted with standard statistical procedures. This leaves much uncertainty given that the choice among the models and the parameterization of different models are subject to uncertainty. And in most cases there is no global theory.

Economic and financial models do not have descriptive power. This point was made by Friedman [ 8 ], University of Chicago economist and recipient of the 1976 Nobel Prize in Economics. Friedman argued that economic models are like those in physics, that is to say, mathematical tools to connect observations. There is no intrinsic rationality in economic models. We must resist the temptation to think that there are a priori truths in economic reasoning.

In summary, economic theories are models that link observations without any pretense of being descriptive. Their validation and eventual rejection are performed with standard statistical methods. But the level of uncertainty is great. As famously observed by Black [ 9 ] in his article “Noise,” “Noise makes it very difficult to test either practical or academic theories about the way that financial or economic markets work. We are forced to act largely in the dark.” That is to say, there is little evidence that allows us to choose between different economic and financial models.

What is the Nature of Our Knowledge of Complex Systems?

The theory of complex systems has as its objective to explain the behavior of systems made up of many interacting parts. In our Introduction, we suggested that the theory of complexity might be relevant to the analysis of economies and financial time series. The key theoretical questions are:

a. Can the behavior of a complex system be explained in terms of the behavior of its components?

b. How do complex systems self-organize, evolve, and increase their complexity?

c. What type of laws and descriptions can we hope to formulate for complex systems?

The first question is essentially the following: Can we give a micro foundation to economics? The second question asks: How do economies develop, grow, and transform themselves? The last question is: Using the theory of complex systems, what type of economic theory can we hope to develop?

The principle of scientific reductionism holds that the behavior of any physical system can be reduced to basic physical laws. In other words, reductionism states that we can logically describe the behavior of any physical system in terms of its basic physical laws. For example, the interaction of complex molecules (such as molecules of drugs and target molecules with which they are supposed to interact) can in principle be described by quantum mechanical theories. The actual computation might be impossibly long in practice but, in theory, the computation should be possible.

Does reductionism hold for very complex physical systems? Can any property of a complex system be mathematically described in terms of basic physical laws? Philip Warren Anderson, co-recipient of the 1977 Nobel Prize in Physics, conjectured in his article "More is different" [ 10 ] that complex systems might exhibit properties that cannot be explained in terms of microscopic laws. This does not mean that physical laws are violated in complex systems; rather, it means that in complex systems there are aggregate properties that cannot be deduced with a finite chain of logical deductions from basic laws. This impossibility is one of the many results on the limits of computability and the limits of logical deduction that were discovered after Gödel's celebrated theorem on the incompleteness of formal logical systems.

Some rigorous results can be obtained for simple systems. For example, Gu et al. [ 11 ] demonstrated that an infinite Ising lattice exhibits properties that cannot be computed in any finite time from the basic laws governing the behavior of the lattice. This result is obtained from well-known results of the theory of computability. In simple terms, even if basic physical laws are valid for any component of a complex system, in some cases the chains of deduction for modeling the behavior of aggregate quantities become infinite. Therefore, no finite computation can be performed.

Reductionism in economic theory is the belief that we can give a micro foundation to macroeconomics, that is, that we can explain aggregate economic behavior in terms of the behavior of single agents. As mentioned, this was the project of economic theory following the Lucas critique. As we will see in Section What Is the Cognitive Value of Neoclassical Economics, Neoclassical Finance? this project produced an idealized concept of economies, far from reality.

We do not know how economic agents behave, nor do we know if and how their behavior can be aggregated to result in macroeconomic behavior. It might well be that the behavior of each agent cannot be computed and that the behavior of the aggregates cannot be computed in terms of individuals. While the behavior of agents has been analyzed in some experimental settings, we are far from having arrived at a true understanding of agent behavior.

This is why a careful analysis of the epistemology of economics is called for. If our objective is to arrive at a science of economics, we should ask only those questions that we can reasonably answer, and refrain from asking questions and formulating theories for which there is no possible empirical evidence or theoretical explanation.

In other words, unless we make unrealistic simplifications, giving a micro foundation to macroeconomics might prove to be an impossible task. Neoclassical economics makes such unrealistic simplifications. A better approximation to a realistic description of economics might be provided by agent-based systems, which we will discuss later. But agent-based systems are themselves complex systems: they do not describe mathematically, rather they simulate economic reality. A truly scientific view of economics should not be dogmatic, nor should it assume that we can write an aggregate model based on micro-behavior.

Given that it might not be possible to describe the behavior of complex systems in terms of the laws of their components, the next relevant question is: So how can we describe complex systems? Do complex systems obey deterministic laws dependent on the individual structure of each system, which might be discovered independently from basic laws (be they deterministic or probabilistic)? Or do complex systems obey statistical laws? Or is the behavior of complex systems simply unpredictable?

It is likely that there is no general answer to these questions. A truly complex system admits many different possible descriptions, depending on the aggregate variables under consideration. It is likely that some aggregate properties can be subject to study while others are impossible to describe. In addition, the types of description might vary greatly. Consider the emission of human verbal signals (i.e., speech). Speech might indeed have near-deterministic properties in terms of formation rules, grammar, and syntax. If we move to the semantic level, speech obeys different laws depending on cultures and domains of interest, which we might partially describe. But modeling the daily verbal emissions of an individual likely remains beyond any mathematical and computational capability. Only broad statistics can be computed.

If we turn to economics, when we aggregate output in terms of prices, we see that the growth of the aggregate output is subject to constraints, such as the availability of money, that make the quantitative growth of economies at least partially forecastable. Once more, it is important to understand for what questions we might reasonably obtain an answer.

There are additional fundamental questions regarding complex systems. For example, as mentioned above: Can global properties spontaneously emerge in complex systems and, if so, how? There are well-known examples of simple self-organizing systems such as unsupervised neural networks. But how can we explain the self-organizing properties of systems such as economic systems or financial markets? Can it be explained in terms of fundamental laws plus some added noise? While simple self-organizing behavior has been found in simulated systems, explaining the emergence and successive evolution of very complex systems remains unresolved.

Self-organization is subject to the same considerations made above regarding the description of complex systems. It is quite possible that the self-organization of highly complex systems cannot be described in terms of the basic laws or rules of behavior of the various components. In some systems, there might be no finite chain of logical deductions able to explain self-organization. This is a mathematical problem, unrelated to the failure or insufficiency of basic laws. Explaining self-organization becomes another scientific problem.

Self-organization is a key concept in economics. Economies and markets are self-organizing systems whose complexity has increased over thousands of years. Can we explain this process of self-organization? Can we capture the process that makes economies and markets change their own structure and adopt new models of interaction?

No clear answer is yet available. Chaitin [ 12 ] introduced the notion of metabiology and has suggested that it is possible to provide a mathematical justification for Darwinian evolution. Arguably one might be able to develop a “metaeconomics” and provide some clue as to how economies or markets develop 3 .

Clearly there are numerous epistemological questions related to the self-organization of complex systems. Presently these questions remain unanswered at the level of scientific laws. Historians, philosophers, and social scientists have proposed many explanations of the development of human societies. Perhaps the most influential has been Hegel's Dialectic, which is the conceptual foundation of Marxism. But these explanations are not scientific in the modern sense of the term.

It is not obvious that complex systems can be handled with quantitative laws. Laws, if they exist, might be of a more general logical nature (e.g., logical laws). Consider the rules of language—a genuine characteristic of the complex system that is the human being: there is nothing intrinsically quantitative. Nor are DNA structures intrinsically quantitative. So with economic and market organization: they are not intrinsically quantitative.

What Scientific Knowledge Can We Have of Our Mental Processes and of Systems That Depend on Them?

We have now come to the last question of importance to our discussion of the epistemological foundations of modern science: What is the place of mental experience in modern science? Can we model the process through which humans make decisions? Or is human behavior essentially unpredictable? The above might seem arcane philosophical or scientific speculation, unrelated to economics or finance. Perhaps. Except that whether or not economics or finance can be studied as a science depends, at least to some extent, on if and how human behavior can be studied as a science.

Human decision-making shapes the course of economies and financial markets: economics and finance can become a science if human behavior can be scientifically studied, at least at some level of aggregation or observability. Most scientific efforts on human behavior have been devoted to the study of neurodynamics and neurophysiology. We have acquired a substantial body of knowledge on how mental tasks are distributed to different regions of the brain. We also have increased our knowledge of the physiology of nervous tissues, of the chemical and electrical exchanges between nervous cells. This is, of course, valuable knowledge from both the practical and the theoretical points of view.

However, we are still far from having acquired any real understanding of mental processes. Even psychology, which essentially categorizes mental events as if they were physical objects, has not arrived at an understanding of mental events. Surely we now know a lot about how different chemicals might affect mental behavior, but we still have no understanding of the mental processes themselves. For example, a beam of light hits a human eye and the conscious experience of a color is produced. How does this happen? Is it the particular structure of molecules in nerve cells that enables vision? Can we really maintain that structure "generates" consciousness? Might consciousness be generated through complex structures, for example with computers? While it is hard to believe that structure in itself creates consciousness, consciousness seems to appear only in association with very complex structures of nerve cells.

John von Neumann [ 13 ] was the first to argue that brains, and more generally any information-processing structure, can be represented as computers. This has been hotly debated by scientists and philosophers and has captured the popular imagination. In simple terms, the question is: Can computers think? In many discussions, there is a more or less explicit confusion between thinking as the ability to perform tasks and thinking as having an experience.

It was Alan Turing who introduced a famous test to determine if a machine is intelligent. Turing [ 14 ] argued that if a machine can respond to questions as a human would do, then the machine has to be considered intelligent. But Turing's criterion says nothing as regards the feelings and emotions of the machine.

In principle we should be able to study human behavior just as we study the behavior of a computer as both brain and computer are made of earthly materials. But given the complexity of the brain, it cannot be assumed that we can describe the behavior of functions that depend on the brain, such as economic or financial decision-making, with a mathematical or computational model.

A key question in studying behavior is whether we have to include mental phenomena in our theories. The answer is not simple. It is common daily experience that we make decisions based on emotions. In finance, irrational behavior is a well-known phenomenon (see Shiller [ 15 ]). Emotions, such as greed or fear, drive individuals to “herd” in and out of investments collectively, thereby causing the inflation/deflation of asset prices.

Given that we do not know how to represent and model the behavior of complex systems such as the brain, we cannot exclude that mental phenomena are needed to explain behavior. For example, a trader sees the price of a stock decline rapidly and decides to sell. It is possible that we will not be able to explain mathematically this behavior in terms of physics alone and have to take into account “things” such as fear. But for the moment, our scientific theories cannot include mental events.

Let's now summarize our discussion on the study of economies or markets as complex systems. We suggested that the epistemological problems of the study of economies and markets are those of complex systems. Reductionism does not always work in complex systems, as the chains of logical deductions needed to compute the behavior of aggregates might become infinite. Generally speaking, the dynamics of complex systems needs to be studied in itself, independent of the dynamics of the components. There is no unique way to describe complex systems: a multitude of descriptions, corresponding to different levels of aggregation and different conceptual viewpoints, are possible. Truly complex systems cannot be described with a single set of laws. The fact that economies and markets are made up of human individuals might require the consideration of mental events and values. If, as we suggest, the correct paradigm for understanding economies and finance is that of complex systems, not that of physics, the question of how to do so is an important one. We will discuss this in Section Econophysics and Econometrics.

Table 1 summarizes our discussion of the epistemology of economics.


Table 1. Similarities and differences between the physical sciences, mainstream economics, and hypothetical future scientific economics.

What Is the Cognitive Value of Neoclassical Economics, Neoclassical Finance?

As discussed above, there are difficult epistemological questions related to the study of economics and finance, questions that cannot be answered in a naïve way, questions such as "What do we want to know?" and "What can we know?", or questions that cannot be answered at all or that need to be reformulated. Physics went through a major conceptual crisis when it had to accept that physical laws are probabilistic and do not describe reality but are subject to the superposition of events. In economics and finance, we have to think hard to (re)define our questions, which might not be as obvious as we think: economies and markets are complex systems; the problems must be carefully conceptualized. We will now analyze the epistemological problems of neoclassical economics and, by extension, of neoclassical finance.

First, what is the cognitive value of neoclassical economics? As observed above, neoclassical economics is not an empirical science in the modern sense of the term; rather, neoclassical economics is a mathematical model of an idealized object that is far from reality. We might say that neoclassical economics has opened the way to the study of economics as a complex system made up of agents that are intelligent processors of information, capable of making decisions on the basis of their forecasts. However, its spirit is very different from that of complex systems theory.

Neoclassical economics is essentially embodied in Dynamic Stochastic General Equilibrium (DSGE) models, the first of which was developed by Finn Kydland and Edward Prescott, co-recipients of the 2004 Nobel Prize in Economics. DSGEs were created to give a micro foundation to macroeconomics following the Lucas critique discussed above. In itself, anchoring macro behavior to micro behavior, that is, explaining macroeconomics in terms of the behavior of individuals (or agents), is a sensible but challenging scientific endeavor. Sensible because the evolution of economic quantities ultimately depends on the decisions made by individuals using their knowledge but also subject to emotions. Challenging because (1) rationalizing the behavior of individuals is difficult, perhaps impossible, and (2) as discussed above, it might not be possible to represent mathematically (and eventually to compute) the aggregate behavior of a complex system in terms of the behavior of its components.

These considerations were ignored in the subsequent development of economic and finance theory. Instead of looking with scientific humility at the complexity of the problem, an alternative idealized model was created. Instead of developing as a science, mainstream economics developed as highly sophisticated mathematical models of an idealized economy. DSGEs are based on three key assumptions:

1. agents form rational expectations;

2. agents optimize, typically by maximizing a utility function;

3. markets are in equilibrium.

From a scientific point of view, the idealizations of neoclassical economics are unrealistic. No real agent can be considered a rational-expectations agent: real agents do not have perfect knowledge of the future. The idea of rational expectations was put forward by Muth [ 16 ], who argued that, on average, economic agents make correct forecasts for the future. This is clearly a non-verifiable statement: we do not know the forecasts made by individual agents and therefore cannot verify whether their mean is correct. There is little conceptual basis for arguing that, on average, people make the right forecasts of variables subject to complex behavior. It is obvious that, individually, we are uncertain about the future; we do not even know what choices we will have to make in the future.

There is one instance where average expectations and reality might converge, at least temporarily. This occurs when expectations change the facts themselves, as happens with investment decision-making where opinions lead to decisions that confirm the opinions themselves. This is referred to as "self-reflectivity." For example, if on average investors believe that the price of a given stock will go up, they will invest in that stock and its price will indeed go up, thereby confirming the average forecast. Investment opinions become self-fulfilling prophecies. However, while investment opinions can change the price of a stock, they cannot change the fundamentals of that stock. If investment opinions are wrong because they are based on a wrong evaluation of fundamentals, at some point opinions, and subsequently the price, will change.

Forecasts are not the only problem with DSGEs. Not only do we not know individual forecasts, we do not know how agents make decisions. The theoretical understanding of the decision-making process is part of the global problem of representing and predicting behavior. As observed above, individual behavior is the behavior of a complex system (in this case, the human being) and might not be predictable, not even theoretically. Let's refine this conclusion: We might find that, at some level of aggregation, human behavior is indeed predictable—at least probabilistically.

One area of behavior that has been mathematically modeled is the process of rational decision-making. Rational decision-making is a process of making coherent decisions. Decisions are coherent if they satisfy a number of theoretical axioms, such as: if choice A is preferred to choice B and choice B is preferred to choice C, then choice A must be preferred to choice C. Coherent decisions can be mathematically modeled by a utility function, that is, a function defined on every choice such that choice A is preferred to choice B if the utility of A is higher than the utility of B. Utility is purely formal. It is not unique: there are infinitely many utility functions that correspond to the same ordering of decisions.
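A small illustrative sketch of this formal, non-unique character of utility: any strictly increasing transformation of a utility function induces exactly the same preference ordering. The payoffs and the two functions below are hypothetical.

```python
import math

# Hypothetical payoffs attached to three choices.
choices = {"A": 3.0, "B": 2.0, "C": 1.0}

def u1(x):
    # One possible utility function.
    return math.log(1 + x)

def u2(x):
    # A strictly increasing transformation of u1: different values, same ordering.
    return 5 * math.log(1 + x) + 7

rank_u1 = sorted(choices, key=lambda c: u1(choices[c]), reverse=True)
rank_u2 = sorted(choices, key=lambda c: u2(choices[c]), reverse=True)

print(rank_u1)                 # ['A', 'B', 'C']
assert rank_u1 == rank_u2      # both utilities encode the same preferences
```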

The idea underlying the theory of decisions and utility functions is that the complexity of human behavior, the eventual free will that might characterize the decision-making process, disappears when we consider simple business decisions such as investment and consumption. That is, humans might be very complex systems but when it comes to questions such as investments, they all behave in the same formal way, i.e., they all maximize the utility function.

There are several difficulties with representing agent decision-making as utility maximization. First, real agents are subject to many influences, including mutual interactions, that distort their decisions. Numerous studies in empirical finance have shown that people do not behave according to the precepts of rational decision-making. Although DSGEs can be considered a major step forward in economics, the models are, in fact, mere intellectual constructions of idealized economies. Even assuming that utility maximization does apply, there is no way to estimate the utility function(s) of each agent. DSGEs do not describe real agents, nor do they describe how expectations are formed. Real agents might indeed make decisions based on past data from which they make forecasts; they do not make decisions based on true future expectations. Ultimately, a DSGE model in its original form cannot be considered scientific.

Because creating models with a "realistic" number of agents (whatever that number might be) would be practically impossible, agents are generally collapsed into a single representative agent by aggregating utility functions. However, as shown by the Sonnenschein-Mantel-Debreu theorem [ 17 ], collapsing agents into a single representative agent does not preserve the conditions that lead to equilibrium. Despite this well-known theoretical result, models used by central banks and other organizations represent economic agents with a single aggregate utility functional.

Note that the representative agent is already a major departure from the original objective of Lucas, Kydland, and Prescott to give a micro foundation to macroeconomics. The aggregate utility functional is obviously not observable. It is generally assumed that the utility functional has a convenient mathematical formulation. See, for example, Smets and Wouters [ 18 ] for a description of one such model.

There is nothing related to the true microstructure of the market in assuming a simple global utility function for an entire economy. In addition, in practice DSGE models assume simple processes, such as autoregressive processes, to make forecasts of quantities, as, for example, in Smets and Wouters [ 18 ] cited above.

What remains of the original formulation of the DSGE is the use of Euler equations and equilibrium conditions. But this is only a mathematical formalism to forecast future quantities. Ultimately, in practice, DSGE models are models where a simple utility functional is maximized under equilibrium conditions, using additional ad hoc equations to represent, for example, production.

Clearly, in this formulation DSGEs are abstract models of an economy without any descriptive power. Real agents do not appear. Even the forward-looking character of rational expectations is lost because these models obviously use only past data which are fed to algorithms that include terms such as utility functions that are assumed, not observed.

How useful are these models? Empirical validation is limited. Being equilibrium models, DSGEs cannot predict phenomena such as boom-bust business cycles. The most cogent illustration has been their inability to predict recent stock market crashes and the ensuing economic downturns.

Econophysics and Econometrics

In Section The Epistemological Foundations of Economics, we discussed the epistemological issues associated with economics; in Section What Is the Cognitive Value of Neoclassical Economics, Neoclassical Finance? we critiqued mainstream economics, concluding that it is not an empirical scientific theory. In this and the following section we discuss new methods, ideas, and results that are intended to give economics a scientific foundation as an empirical science. We will begin with a discussion of econophysics and econometrics.

The term Econophysics was coined in 1995 by the physicist Eugene Stanley. Econophysics is an interdisciplinary research effort that combines methods from physics and economics. In particular, it applies techniques from statistical physics and non-linear dynamics to the study of economic data and it does so without the pretense of any a priori knowledge of economic phenomena.

Econophysics obviously overlaps with the more traditional discipline of econometrics. Indeed, it is difficult to separate the two in any meaningful way. Econophysics also overlaps with economics based on artificial markets formed by many interacting agents. Perhaps a distinguishing feature of econophysics is its interdisciplinarity, though one can reasonably argue that any quantitative modeling of financial or economic phenomena shares techniques with other disciplines. Another distinguishing feature is its search for universal laws; econometrics is more opportunistic. However, these distinctions are objectively weak. Universality in economics is questionable, and econometrics uses methods developed in pure mathematics.

To date, econophysics has focused on analyzing financial markets. The reason is obvious: financial markets generate huge quantities of data. The availability of high-frequency data and ultra-high-frequency data (i.e., tick-by-tick data) has facilitated the use of the methods of physics. For a survey of Econophysics, see in particular Lux [ 19 ] and Chakraborti et al. [ 20 ]; see Gallegati et al. [ 21 ] for a critique of econophysics from inside.

The main result obtained to date by econophysics is the analysis and explanation of inverse power law distributions empirically found in many economic and financial phenomena. Power laws have been known and used for more than a century, starting with the celebrated Pareto law of income distribution. Power law distributions were proposed in finance in the 1950s, for example by Mandelbrot [ 22 ]. More recently, econophysics has performed a systematic scientific study of power laws and their possible explanations.

Time series or cross-sectional data characterized by inverse power law distributions have special characteristics that are important for economic theory as well as for practical applications such as investment management. Inverse power laws characterize phenomena in which very large events are not negligibly rare. The effect is that individual events, or individual agents, become very important.

Diversification, which is a pillar of classical finance and investment management, becomes difficult or nearly impossible if distributions follow power laws. Averages lose importance as the dynamics of phenomena is dominated by tail events. Given their importance in explaining economic and financial phenomena, we will next briefly discuss power laws.

Let's now look at the mathematical formulation of distributions characterized by power laws and consider some examples of how they have improved our understanding of economic and financial phenomena.

The tails of the distribution of a random variable r follow an inverse power law if the probability of the tail region decays hyperbolically:

P(|r| > x) ∝ x^(-α)

The properties of the distribution critically depend on the magnitude of the exponent α. Values α < 2 characterize Lévy distributions with infinite variance (and infinite mean if α < 1); values α > 2 characterize distributions with finite mean and variance.

Consider financial returns. Most studies place the value of α for financial returns at around 3 [ 19 ]. This finding is important because values α < 2 would imply invariance of the distribution, and therefore of the exponent, with respect to the summation of variables. That is, the sum of returns would have the same exponent as the summands. This would rule out any possibility of diversification and would imply that returns at any time horizon have the same distribution. Instead, values of α around 3 imply that variables become normal after temporal aggregation over sufficiently long time horizons. This is indeed what has been empirically found: returns become normal over periods of 1 month or more.
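As an illustration of how such tail exponents are estimated in practice, here is a minimal sketch of the Hill estimator, one standard estimator of α, applied to simulated Student-t returns, whose true tail exponent equals the degrees of freedom. The choice of the cutoff k is a well-known practical difficulty and is fixed arbitrarily here.

```python
import numpy as np

def hill_alpha(x, k):
    """Hill estimator of the tail exponent from the k largest values of |x|."""
    s = np.sort(np.abs(x))
    tail, threshold = s[-k:], s[-(k + 1)]
    return k / np.sum(np.log(tail / threshold))

rng = np.random.default_rng(1)
# Student-t returns with 3 degrees of freedom: true tail exponent alpha = 3.
r = rng.standard_t(df=3, size=100_000)
print(hill_alpha(r, k=1000))   # typically close to 3
```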

Power laws have also been found in the autocorrelation of volatility. In general, the autocorrelation of returns is close to zero. However, the autocorrelation of volatility (measured by the autocorrelation of the absolute value of returns, or of squared returns) decays as an inverse power law:

corr(|r_t|, |r_(t+τ)|) ∝ τ^(-β)

The exponent β found empirically is typically around 0.3. Such a small exponent implies a long-term dependence of volatility.
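In practice the decay exponent can be measured by regressing the log autocorrelation of absolute returns on the log of the lag. The sketch below does this on a toy GARCH(1,1) series used purely as a stand-in for observed returns; strictly speaking, a GARCH process has exponentially decaying autocorrelations, so it is on real data that the slow power-law decay appears.

```python
import numpy as np

def acf_abs(r, max_lag):
    """Autocorrelation of |r| at lags 1..max_lag."""
    a = np.abs(r) - np.abs(r).mean()
    var = np.mean(a * a)
    return np.array([np.mean(a[:-h] * a[h:]) / var for h in range(1, max_lag + 1)])

# Toy GARCH(1,1) series as a stand-in for an observed return series.
rng = np.random.default_rng(2)
n, omega, alpha, beta = 100_000, 1e-6, 0.09, 0.90
r = np.empty(n)
var_t = omega / (1 - alpha - beta)          # start at the stationary variance
for t in range(n):
    r[t] = np.sqrt(var_t) * rng.standard_normal()
    var_t = omega + alpha * r[t] ** 2 + beta * var_t

lags = np.arange(1, 101)
rho = np.clip(acf_abs(r, 100), 1e-6, None)  # guard against non-positive values
slope, _ = np.polyfit(np.log(lags), np.log(rho), 1)
print(-slope)   # estimated decay exponent; ~0.3 is the typical empirical value
```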

Power laws have been found in other phenomena. Trading volume decays as a power law; power laws have also been found in the volume and number of trades per time unit in high-frequency data; other empirical regularities have been observed, especially for high-frequency data.

As for economic phenomena more generally, power laws have been observed in the distribution of loans, the market capitalization of firms, and income distribution (the original Pareto law). Power laws have also been observed in non-financial phenomena such as the distribution of the size of cities 4 .

Power law distributions of returns and of volatility appear to be a universal feature in all liquid markets. But Gallegati et al. [ 21 ] suggest that the supposed universality of these empirical findings (which do not depend on theory) is subject to much uncertainty. In addition, there is no theoretical reason to justify the universality of these laws. It must be said that the level of validation of each finding is not extraordinarily high. For example, distributions other than power laws have been proposed, including stretched exponentials and tempered distributions. There is no consensus on any of these findings. As observed in Section The Epistemological Foundations of Economics, the empirical content of individual findings is low; it is difficult to choose between the competing explanations.

Attempts have been made to offer theoretical explanations for the findings of power laws. Econophysicists have tried to capture the essential mechanisms that generate power laws. For example, it has been suggested that power laws in one variable naturally lead to power laws in other variables. In particular, power law distributions of the size of market capitalization or similar measures of the weight of investors explain most other financial power laws [ 23 ]. The question is: How were the original power laws generated?

Two explanations have been proposed. The first is based on non-linear dynamic models. Many, perhaps most, non-linear models create unconditional power law distributions. For example, the ARCH/GARCH models create unconditional power laws in the distribution of returns, though the exponents do not fit empirical data. The Lux-Marchesi dynamic model of trading [ 24 ] was the first model able to explain the power laws of both returns and volatility autocorrelation. Many other dynamic models have since been proposed.

A competing explanation is based on the properties of percolation structures and random graph theory, as originally proposed by Cont and Bouchaud [ 25 ]. When the probability of interaction between adjacent nodes approaches a critical value that depends on the topology of the percolation structure or the random graph, the distribution of connected components follows a power law. Assuming financial agents can be represented by the nodes of a random graph, demand created by aggregation produces a fat-tailed distribution of returns.
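The following sketch is loosely in the spirit of the Cont-Bouchaud mechanism rather than a faithful reproduction of their model: it builds a near-critical random graph, treats each connected component as a cluster of agents acting in unison, and aggregates cluster demand into a "return." The graph size and activation probability are arbitrary illustrative choices; the resulting return distribution is markedly fat-tailed.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n = 5000
# Edge probability ~1/n puts the Erdos-Renyi graph near its critical point,
# where connected-component sizes are heavy-tailed.
g = nx.erdos_renyi_graph(n, 1.0 / n, seed=3)
sizes = np.array([len(c) for c in nx.connected_components(g)])

# Each period, every cluster independently buys (+1), sells (-1), or abstains.
T, a = 2000, 0.01                      # a = activation probability (arbitrary)
u = rng.random((T, sizes.size))
action = np.where(u < a, 1.0, np.where(u < 2 * a, -1.0, 0.0))
returns = action @ sizes / n           # aggregate demand proxies the return

# Positive excess kurtosis signals tails fatter than the normal distribution.
m = returns - returns.mean()
print(np.mean(m**4) / np.mean(m**2) ** 2 - 3)
```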

Random Matrices

Econophysics has also obtained important results in the analysis of large covariance and correlation matrices, separating noise from information. For example, financial time series of returns are weakly autocorrelated in time but strongly correlated across series. Correlation matrices play a fundamental role in portfolio management and many other financial applications.

However, estimating correlation and covariance matrices for large markets is problematic: the number of parameters to estimate (i.e., the entries of the covariance matrix) grows with the square of the number of time series, while the number of available data points grows only linearly with the number of time series. In practice, the empirical estimator of a large covariance matrix is very noisy and cannot be used as is. Borrowing from physics, econophysicists suggested a solution based on the theory of random matrices, which had been applied to solve problems in quantum physics.

The basic idea of random matrices is the following. Consider a sample of N time series of length T. Suppose the series are formed by independent and identically distributed zero-mean normal variables. In the limit of N and T going to infinity with a constant ratio Q = T/N, the distribution of the eigenvalues of the sample covariance matrix of these series was determined by Marchenko and Pastur [ 26 ]. Though the law itself is a simple algebraic function, its derivation is complicated. The remarkable finding is that there is a universal theoretical distribution of eigenvalues in an interval that depends only on Q.

This fact suggested a method for identifying the number of meaningful eigenvalues of a large covariance matrix: only those eigenvalues that fall outside the interval of the Marchenko-Pastur law are significant (see Plerou et al. [ 27 ]). A covariance matrix can therefore be made robust by performing a Principal Components Analysis (PCA) and retaining only those principal components corresponding to meaningful eigenvalues. Random matrix theory has been generalized to include correlated and autocorrelated time series and non-normal distributions (see Burda et al. [ 28 ]).
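A minimal sketch of the filtering idea: for pure i.i.d. noise, the eigenvalues of the empirical correlation matrix should fall (up to finite-size effects) inside the Marchenko-Pastur interval, and anything beyond the upper edge is treated as signal. Real return data, unlike the simulated noise used here, typically shows a few large eigenvalues well outside the interval.

```python
import numpy as np

rng = np.random.default_rng(4)
N, T = 200, 1000                       # 200 series of length 1000
Q = T / N

# Pure-noise "returns": i.i.d. standard normals, so any structure is spurious.
X = rng.standard_normal((T, N))
C = np.corrcoef(X, rowvar=False)       # N x N empirical correlation matrix
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur support for unit-variance noise depends only on Q.
lam_minus = (1 - np.sqrt(1 / Q)) ** 2
lam_plus = (1 + np.sqrt(1 / Q)) ** 2

# Eigenvalues above lam_plus would be kept as "signal"; for noise, almost none.
print(f"MP interval: [{lam_minus:.3f}, {lam_plus:.3f}], max: {eigvals.max():.3f}")
print("eigenvalues above the upper edge:", int(np.sum(eigvals > lam_plus)))
```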

Econometrics and VAR Models

As mentioned above, the literature on econophysics overlaps with the econometric literature. Econometricians have developed methods to capture properties of time series and model their evolution. Stationarity, integration and cointegration, and the shifting of regimes are properties and models that come from the science of econometrics.

The study of time series has opened a new direction in the study of economics with the use of Vector Auto Regressive (VAR) models. VAR models were proposed by Christopher Sims in the 1980s (for his work, Sims shared the 2011 Nobel Prize in Economics with Thomas Sargent). Given a vector of variables x_t, a VAR model represents the dynamics of the variables as the regression of each variable on lagged values of all variables:

x_t = c + A_1 x_(t-1) + A_2 x_(t-2) + ... + A_p x_(t-p) + ε_t

where c is a vector of intercepts, the A_i are coefficient matrices, p is the number of lags, and ε_t is a vector of error terms.

The use of VAR models in economics is typically associated with dimensionality-reduction techniques. Because tens or even hundreds of economic time series are now available, PCA or similar techniques are used to reduce the number of variables so that the VAR parameters can be estimated.
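A minimal sketch of this pipeline, using the statsmodels VAR implementation on simulated data as a placeholder for a real macroeconomic panel: reduce 50 series to 3 principal components, then fit a VAR(2) on the factors and produce a short forecast. The panel size, the number of factors, and the lag order are arbitrary illustrative choices.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
# Placeholder panel: 300 periods of 50 economic series (real data would go here).
raw = pd.DataFrame(rng.standard_normal((300, 50)))

# Dimensionality reduction via PCA: keep the first 3 principal components.
centered = raw - raw.mean()
_, _, vt = np.linalg.svd(centered.values, full_matrices=False)
factors = pd.DataFrame(centered.values @ vt[:3].T, columns=["f1", "f2", "f3"])

# Fit a VAR(2) on the factors and forecast four steps ahead.
result = VAR(factors).fit(2)
print(result.forecast(factors.values[-result.k_ar:], steps=4))
```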

What are the similarities and differences between econometrics and econophysics? Both disciplines try to find mathematical models of economic and/or financial variables. One difference between the two is perhaps the fact that econophysics attempts to find universal phenomena shared by every market. Econometricians, on the other hand, develop models that can be applied to individual time series without considering their universality. Hence econometricians focus on methods of statistical testing, because the applicability of models has to be tested in each case. This distinction might prove to be unimportant, as there is no guarantee that we can find universal laws. Thus far, no model has been able to capture all the features of financial time series; because each model requires an independent statistical validation, the empirical content is weak.

New Directions in Economics

We will now explore some new directions in economic theory. Let's start by noting that we do not have a reasonably well-developed, empirically validated theory of economics. Perhaps the most developed fields are the analysis of instabilities and economic simulation. The main lines of research, however, are clear and represent a departure from neoclassical theory. They can be summarized thus:

1. Social values and objectives must be separated from economic theory; that is, we have to separate political economics from pure economic theory. Economies are systems in continuous evolution, a fact that is not appreciated in neoclassical economics, which considers only aggregated quantities.

2. The output of economies is primarily the creation of order and complexity, both at the level of products and social structures. Again, this fact is ignored by neoclassical economics, which takes a purely quantitative approach without considering changes in the quality of the output or the power structure of economies.

3. Economies are never in a state of equilibrium, but are subject to intrinsic instabilities.

4. Economic theory needs to consider economies as physical systems in a physical environment; it therefore needs to take into consideration environmental constraints.

Let's now discuss how new directions in economic theory are addressing the above.

Economics and Political Economics

As mentioned above, economic theory should be clearly separated from political economics. Economies are human artifacts engineered to serve a number of purposes. Most economic principles are not laws of nature but reflect social organization. As in any engineering enterprise, the engineering objectives should be kept separate from the engineering itself and the underlying engineering principles and laws. Determining the objectives is the realm of political economics; engineering the objectives is the realm of economic theory.

One might object that there is a contradiction between the notion of economies as engineered artifacts and the notion of economies as evolving systems subject to evolutionary rules. This contradiction is akin to the contradiction between studying human behavior as a mechanistic process and simultaneously studying how to improve ourselves.

We will not try to solve this contradiction at a fundamental level. Economies are systems whose evolution is subject to uncertainty. Of course the decisions we make about engineering our economies are part of the evolutionary process. Pragmatically, if not philosophically, it makes sense to render our objectives explicit.

For example, Acemoglu, Robinson, and Verdier wrote an entry on VOX, CEPR's Policy Portal ( http://www.voxeu.org/article/cuddly-or-cut-throat-capitalism-choosing-models-globalised-world ), noting that we have the option to choose between different forms of capitalism (see, for example, Hall and Soskice [ 29 ]), in particular between what they call "cuddly capitalism" and "cut-throat capitalism." It makes sense, pragmatically, to debate what type of system, in this case of capitalism, we want. An evolutionary approach, on the other hand, would study what decisions were/will be made.

The separation between objectives and theory is not always made clear, especially in light of political considerations. Actually, there should be multiple economic theories corresponding to different models of economic organization. Currently, however, the mainstream model of free markets is the dominant model; any other model is considered either an imperfection of the free-market competitive model or a failure, for example Soviet socialism. This is neither a good scientific attitude nor a good engineering approach. The design objectives of our economies should come first, then theory should provide the tools to implement the objectives.

New economic thinking is partially addressing this need. In the aftermath of the 2007–2009 financial crisis and the subsequent questioning of mainstream economics, some economists are tackling socially-oriented issues, in particular, the role and functioning of the banking system, the effect of the so-called austerity measures, and the social and economic implications of income and wealth inequality.

There is a strand of economic literature, albeit small, known as metaeconomics, that is formally concerned with the separation of objectives and theory in economics. The term metaeconomics was first proposed by Karl Menger, an Austrian mathematician and member of the Vienna Circle 5 . Influenced by David Hilbert's program to give a rigorous foundation to mathematics, Menger proposed metaeconomics as a theory of the logical structure of economics.

The term metaeconomics was later used by Schumacher [ 30 ] to give a social and ethical foundation to economics, and is now used in this sense by behavioral economists. Metaeconomics, of course, runs contrary to mainstream economics which adheres to the dogma of optimality and excludes any higher-level discussion of objectives.

Economies as Complex Evolving Systems

What are the characteristics of evolutionary complex systems such as our modern economies? An introduction can be found in Beinhocker [ 31 ]. Associated with the Institute for New Economic Thinking (INET) 6 , Beinhocker attributes to Nicholas Georgescu-Roegen many of the new ideas in economics that are now receiving greater attention.

Georgescu-Roegen [ 32 ] distinguishes two types of evolution: slow biological evolution and the fast cultural evolution typical of modern economies; thus the term bioeconomics. The entropy accounting of the second law of thermodynamics implies that any local increase of order comes at a cost: it requires energy and, in the case of modern economies, produces waste and pollution. Georgescu-Roegen argued that because classical economics does not take into account the basic laws of entropy, it is fundamentally flawed.

When Georgescu-Roegen first argued his thesis back in the 1930s, economists did not bother to respond. Pollution and depletion of natural resources were not on any academic agenda. But if economics is to become a scientific endeavor, it must consider the entropy accounting of production. While now much discussed, themes such as energy sources, sustainability, and pollution are still absent from the considerations of mainstream economics.

It should be clear that these issues cannot be solved with a mathematical algorithm. As a society, we are far from being able, or willing, to make a reasonable assessment of the entropy balance of our activities, economic and other. But a science of economics should at least be able to estimate (perhaps urgently) the time scales of these processes.

Economic growth and wealth creation are therefore based on creating order and complexity. Understanding growth, and eventually business cycles and instabilities, calls for an understanding of how complexity evolves—a more difficult task than understanding the numerical growth of output.

Older growth theories were based on simple production functions and population growth. Assuming that an economy produces a kind of composite good with an appropriate production function, one can demonstrate that, by setting aside part of output as capital, the economy increases its production capabilities at every time step and exhibits exponential growth. But this is a naïve view of the economy. An increase of complexity is the key ingredient of economic growth.
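As a minimal illustration of this classical logic (a textbook-style sketch, not a model taken from this paper), suppose a constant fraction $s$ of output is reinvested each period and that capital $K_t$ yields output $aK_t$:

$$K_{t+1} = K_t + s\,aK_t = (1+sa)\,K_t \quad\Longrightarrow\quad K_t = (1+sa)^t K_0,$$

so output grows exponentially at the rate $sa$ per period. The point of the passage is precisely that this arithmetic, however tidy, says nothing about the growth of complexity.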

The study of economic complexity is not new. At the beginning of the twentieth century, the Austrian School of Economics introduced the idea, typical of complex systems, that order in market systems is a spontaneous, emerging property. As mentioned above, this idea was already present in Adam Smith's invisible hand that coordinates markets. The philosopher-economist Friedrich Hayek devoted much theoretical thinking to complexity and its role in economics.

More recently, research on economies as complex systems started in the 1980s at The Santa Fe Institute (Santa Fe, New Mexico). There, under the direction of the economist W. Brian Arthur, researchers developed one of the first artificial economies. Some of the research done at the Santa Fe Institute is presented in three books titled The Economy as an Evolving Complex System , published by The Santa Fe Institute.

At the Massachusetts Institute of Technology (MIT), the Observatory on Economic Complexity gathers and publishes data on international trade and computes various measures of economic complexity, including the Economic Complexity Index (ECI) developed by Cesar Hidalgo and Ricardo Hausmann. Complexity economics is now a subject of research at many universities and economic research centers.

How can systems increase their complexity spontaneously, thereby evolving? Lessons from biology might help. Chaitin [ 12 ] proposed a mathematical theory based on the theory of algorithmic complexity that he developed to explain Darwinian evolution. Chaitin's work created a whole new field of study—metabiology—though his results are not universally accepted as proof that Darwinian evolution works in creating complexity.

While no consensus exists, and no existing theory is applicable to economics, it is nevertheless necessary to understand how complexity is created if we want to understand how economies grow or eventually fail to grow.

Granting that complexity plays this role in creating economic growth and wealth, how do we compare the complexity of objects as different as pasta, washing machines, and computers? And how do we measure complexity? While complexity can be measured by a number of mathematical measures, such as those of the algorithmic theory of complexity, there is no meaningful way to aggregate these measures into a measure of the aggregate output.

Mainstream economics uses price—the market value of output—to measure the aggregate output. But there is a traditional debate on value, centered on the question of whether price is a measure of value. A Marxist economist would argue that value is the amount of labor necessary to produce that output. We will stay within market economies and use price to measure aggregate output. The next section discusses the issues surrounding aggregation by price.

The Myth of Real Output

Aggregating so many (often rapidly changing) products 7 quantitatively by physical standards is an impossible task. We can categorize products and services, such as cars, computers, and medical services, but what quantities do we associate with them?

Economics has a conceptually simple answer: products are aggregated in terms of price, the market price in free-market economies as mentioned above, or centrally planned prices in planned economies. The total value of goods produced in a year is called the nominal Gross National Product (GNP). But there are two major problems with this.

First, in practice, aggregation is unreliable: not all products and services are priced; many products and services are simply exchanged or self-produced; black-market and illegal economies exist and are not negligible; and data collection can be faulty. Therefore, any number that represents the aggregate price of goods exchanged has to be considered uncertain and subject to error.

Second, prices are subject to change. If we compare prices over long periods of time, changes in prices can be macroscopic. For example, the price of an average car in the USA increased by an order of magnitude, from a few thousand dollars in the 1950s to tens of thousands of dollars in the 2010s. Certainly cars have changed over the years, adding features such as air conditioning, but the amount of money in circulation has also changed.

The important question to address is whether physical growth corresponds to the growth of nominal GNP. The classical answer is no, as the level of prices changes. But there is a contradiction here: to measure any increase in the price level, we should be able to measure physical growth and compare it with the growth of nominal GNP. Yet there is no way to realistically measure physical growth; any choice of parameter is arbitrary.

The usual solution to this problem is to consider the price change (increase or decrease) of a panel of goods considered to be representative of the economy. The nominal GNP is divided by the price index to produce what is called real GNP. This process has two important limitations. First, the panel of representative goods does not represent a constant fraction of the total economy nor does it represent whole sectors, such as luxury products or military expenditures. Second, the panel of representative goods is not constant as products change, sometimes in very significant ways.

Adopting an operational point of view, the meaning of the real GNP is defined by how it is constructed: it is the nominal GNP weighted with the price of some average panel of goods. Many similar constructions are possible, depending on the choice of the panel of representative goods. There is therefore a fundamental arbitrariness in how real GNP is measured. The growth of the real GNP represents only one of many different possible concepts of growth. Growth does exist in some intuitive sense, but quantifying it in some precise way is largely arbitrary. Here we are back to the fundamental issue that economies are complex systems.
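To make the construction concrete, here is a minimal sketch of deflating nominal GNP with a fixed panel of goods. All numbers, and the panel itself, are invented for illustration; official statistical agencies use far larger baskets and more elaborate index formulas.

```python
# Minimal sketch of GNP deflation with a fixed panel of goods.
# All prices, quantities, and growth figures are illustrative, not official data.

panel = {
    # good: base-year price, current-year price, base-year quantity weight
    "bread":   {"p0": 2.0,    "p1": 2.6,    "qty": 100},
    "car":     {"p0": 20_000, "p1": 26_000, "qty": 1},
    "haircut": {"p0": 15.0,   "p1": 18.0,   "qty": 10},
}

# Laspeyres-style index: cost of the base-year basket at current vs. base prices
cost_base = sum(g["p0"] * g["qty"] for g in panel.values())
cost_now  = sum(g["p1"] * g["qty"] for g in panel.values())
price_index = cost_now / cost_base          # ~1.30 here: a 30% price-level rise

nominal_gnp_growth = 1.40                   # assume nominal GNP grew 40%
real_gnp_growth = nominal_gnp_growth / price_index

print(f"price index:         {price_index:.3f}")
print(f"implied real growth: {real_gnp_growth:.3f}")
```

Swap in a different panel, say one including luxury goods or services whose prices behaved differently, and the "real" growth figure changes: exactly the arbitrariness described above.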

Describing mathematically the evolution of the complexity of an economy is a difficult, perhaps impossible, task. When we aggregate by price, the problem becomes more tractable because there are constraints on financial transactions, essentially due to the amount of money in circulation and rules related to the distribution of money to different agents.

But it does not make sense to aggregate by price the output of an entire country. We suggest that it is necessary to model different sectors and understand the flows of money. Some sectors might extend over national boundaries. Capital markets, for example, are truly international (we do not model them as purely national); the activity of transnational corporations can span a multitude of countries. What is required is an understanding of what happens under different rules.

Here we come upon what is probably the fundamental problem of economics: the power structure. Who has the power to make decisions? Studying human structures is not like studying the behavior of a ferromagnet. Decisions and knowledge are intertwined in what the investor George Soros has called the reflexivity of economies.

Finance, the Banking System, and Financial Crises

In neoclassical economics, finance is transparent; in real-world economies, this is far from the case. Real economies produce complexity and evolve in ways that are difficult to understand. Generally speaking, the financial and banking systems allow a smooth evolution of the economic system, providing the money necessary to sustain transactions and thereby enabling the sale and purchase of goods and services. While theoretically providing the money needed to sustain growth, the financial and banking systems might either provide too little money and thereby constrain the economy, or provide too much money and thereby produce inflation, especially asset inflation.

Asset inflation is typically followed by asset deflation, as described by Minsky [ 33 ] in his financial instability hypothesis. Minsky argued that capitalist economies exhibit asset inflations due to the creation of excess money, followed by debt deflations that, because of fragile financial systems, can end in financial and economic crises. Since Minsky first formulated his financial instability hypothesis, much has changed and additional analyses have appeared.

First, it has become clear that the process of money creation is endogenous, whether by central banks or by commercial banks. What has become apparent, especially since the 2007–2009 financial crisis, is that central banks can create money greatly in excess of economic growth and that this money might not flow uniformly throughout the economy but follow special, segregated paths (or flows), eventually remaining in the financial system and thereby producing asset inflation but little to no inflation in the real economy.

Another important change has been globalization, with goods and capital flowing freely in and out of countries according to where capital earns the highest returns or faces the lowest tax bill. As local economies lost importance, countries have been scrambling to transform themselves to compete with low-cost/low-tax countries. Some Western economies have been successful in specializing in added-value sectors such as financial services. These countries have experienced huge inflows of capital from all over the world, creating an additional push toward asset inflation. In recent years, indexes such as the S&P 500 have grown at multiples of the nominal growth of their reference economies.

But within a few decades of the beginning of globalization, some of the economies that produced low-cost manufactured goods have captured the entire production cycle, from design and engineering to manufacturing and servicing. Unable to compete, Western economies started an unprecedented process of printing money on a large scale, with, as a result, the recurrence of financial crashes followed by periods of unsustainable financial growth.

Studying such crises is a major objective of economics. ETH Zurich's Didier Sornette, who started his career as a physicist specializing in forecasting rare phenomena such as earthquakes, has developed a mathematical analysis of financial crises using non-linear dynamics, following Minsky's financial instability hypothesis. Together with his colleague Peter Cauwels, Sornette hypothesizes that financial crises are critical points in a process of superexponential growth of the economy [ 34 ].
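To convey the flavor of the idea, here is a standard reduced-form sketch of a finite-time singularity; it is offered for intuition and is not necessarily the exact formulation used in Cauwels and Sornette [ 34 ]. If a quantity $p$ grows superexponentially, with a growth rate that itself increases with size, say $\dot p = r\,p^{m}$ with $m > 1$, then integration gives

$$p(t) = p_0\left[\,1 - (m-1)\,r\,p_0^{\,m-1}\,t\,\right]^{-1/(m-1)},$$

which diverges at the finite critical time $t_c = 1/\left[(m-1)\,r\,p_0^{\,m-1}\right]$. Since the singularity cannot actually be reached, the approach to $t_c$ signals an unsustainable regime that must end in a crash or a change of regime.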

Artificial Economies

As discussed above, the mathematical analysis of complex systems is difficult and might indeed be an impossible task. To overcome this problem, an alternative route is the development of agent-based artificial economies. Artificial economies are computer programs that simulate economies. Agent-based artificial economies simulate real economies by creating sets of artificial agents whose behavior mimics that of real agents.

The advantage of artificial economies is that they can be studied almost empirically, without the need to perform mathematical analysis, which can be extremely difficult or impossible. The disadvantage is that they are engineered systems whose behavior depends on the engineering parameters. The risk is that one finds exactly what one wants to find. The development of artificial markets with zero-intelligence agents was intended to overcome this problem by studying those market properties that depend only on the trading mechanism and not on agent characteristics.
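The following toy sketch conveys the spirit of a zero-intelligence market in the Gode–Sunder tradition: traders submit random bids and asks within a fixed price range, and trades occur whenever the best bid crosses the best ask. It is a deliberately minimal illustration, not a reproduction of any published model.

```python
import random

# Toy zero-intelligence market: random bids/asks within a fixed price band;
# a trade occurs whenever the best bid crosses the best ask. Any regularities
# that emerge come from the trading mechanism, not from agent "intelligence".
random.seed(42)

PRICE_MIN, PRICE_MAX = 1.0, 200.0
N_STEPS = 10_000

bids, asks, trades = [], [], []
for _ in range(N_STEPS):
    if random.random() < 0.5:
        bids.append(random.uniform(PRICE_MIN, PRICE_MAX))   # random buyer arrives
    else:
        asks.append(random.uniform(PRICE_MIN, PRICE_MAX))   # random seller arrives
    if bids and asks:
        best_bid, best_ask = max(bids), min(asks)
        if best_bid >= best_ask:                             # crossing orders trade
            trades.append((best_bid + best_ask) / 2)
            bids.remove(best_bid)
            asks.remove(best_ask)

print(f"{len(trades)} trades, mean price {sum(trades) / len(trades):.1f}")
```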

There is by now a considerable literature on the development of artificial economies and the design of agents. See Chakraborti et al. [ 20 ] for a recent review. Leigh Tesfatsion at Iowa State University maintains a site that provides a wealth of information on agent-based systems: http://www2.econ.iastate.edu/tesfatsi/ace.htm .

Evolution of Neoclassical Economics

Among classical economists, efforts are underway to bring the discipline closer to an empirical science. Among the “new” classical economists is David Colander, who has argued that the term “mainstream” economics does not reflect current reality because of the many ramifications of mainstream theories.

Some of the adjustments underway are new versions of DSGE models that now include a banking system, deviations from perfect rationality, and the question of liquidity. As observed above, DSGE models are a sort of complex system made up of many intelligent agents. It is therefore possible, in principle, to view complex systems as an evolution of DSGEs. However, most basic concepts of DSGEs, in particular equilibrium, rational expectations, and the lack of interaction between agents, would have to be deeply modified. Should DSGEs evolve into modern complex systems, the new generation of models would be very different from the current generation of DSGEs.

Conclusions

In this paper we have explored the status of economics as an empirical science. We first analyzed the epistemology of economics, remarking on the necessity to carefully analyze what we consider observations (e.g., volatility, inflation) and to pose questions that can be reasonably answered based on observations.

In physics, observables are processes obtained through the theory itself, using complex instruments. Physical theory responds to empirical tests in toto ; individual statements have little empirical content. In economics, given the lack of a comprehensive theory, observations are elementary observations, such as prices, and theoretical terms are related to observables in a direct way, without cross validation. This weakens the empirical content of today's prevailing economic theory.

We next critiqued neoclassical economics, concluding that it is not an empirical science but rather the study of an artificial idealized construction with little connection to real-world economies. This conclusion is based on the fact that neoclassical economics is embodied in DSGE models which are only weakly related to empirical reality.

We then explored new ideas that hold the promise of developing economics along the lines of an empirical science. Econophysics, an interdisciplinary effort to place economics on a sound scientific footing, has produced a number of results related to the analysis of financial time series, in particular the study of inverse power laws. But while econophysics has produced a number of models, it has yet to propose a new global economic theory.

Other research efforts are centered on looking at economies as complex evolutionary systems that produce order and increasing complexity. Environmental constraints due to the accounting of energy and entropy are beginning to gain attention in some circles. As with econophysics, the study of the economy as a complex system has yet to produce a comprehensive theory.

The most developed area of new research efforts is the analysis of instabilities, building on Hyman Minsky's financial instability hypothesis. Instabilities are due to interactions between a real productive economy subject to physical constraints, and a financial system whose growth has no physical constraints.

Lastly, efforts are also being made among classical economists to bring their discipline increasingly into the realm of an empirical science, adding, for example, the banking system and boundedly rational behavior to DSGE models.

Conflict of Interest Statement

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

1. ^ By “neoclassical economics,” we refer to an economic theory based on the notions of optimization, the efficient market hypothesis, and rational expectations. Among the major proponents of neoclassical economic thinking are Robert Lucas and Eugene Fama, both from the University of Chicago and both recipients of the Nobel Prize in Economics. Because neoclassical economic (and finance) theory is presently the dominant theory, it is also often referred to as “mainstream” theory or “the prevailing” theory. Attempts are being made to address some of the shortcomings of neoclassical economics, such as by considering the banking system, money creation, and liquidity.

2. ^ For example, in the Special Relativity Theory, the concept of simultaneity of distant events is not an a priori concept but depends on how we observe simultaneity through signals that travel at a finite speed. To determine simultaneity, we perform operations based on sending and receiving signals that travel at finite speed. Given the invariance of the speed of light, these operations make simultaneity dependent on the frame of reference.

3. ^ The term “metaeconomics” is currently used in a different sense. See Section Econophysics and Econometrics below. Here we use metaeconomics in analogy with Chaitin's metabiology.

4. ^ Power laws are ubiquitous in physics where many phenomena, such as the size of ferromagnetic domains, are characterized by power laws.

5. ^ Karl Menger was the son of the economist Carl Menger, the founder of the Austrian School of Economics.

6. ^ The Institute for New Economic Thinking (INET) is a not-for-profit think tank whose purpose is to support academic research and teaching in economics “outside the dominant paradigms of efficient markets and rational expectations.” Founded in 2009 with the financial support of George Soros, INET is a response to the global financial crisis that started in 2007.

7. ^ Beinhocker [ 31 ] estimates that, in the economy of a city like New York, the number of Stock Keeping Units (SKUs), with each SKU corresponding to a different product, is on the order of tens of billions.

1. Fabozzi FJ, Focardi SM, Jonas C. Investment Management: A Science to Teach or an Art to Learn? Charlottesville: CFA Institute Research Foundation (2014).


2. Quine WV. Two Dogmas of Empiricism (1951). Available online at: http://www.ditext.com/quine/quine.html

3. Bridgman PW. The Logic of Modern Physics (1927). Available online at: https://archive.org/details/logicofmodernphy00brid

4. Hempel CG. Aspects of Scientific Explanation and Other Essays in the Philosophy of Science . New York: Free Press (1970).

5. Feyerabend P. Against Method . 4th ed. London: Verso (1975).

6. Kuhn TS. The Structure of Scientific Revolutions . 3rd ed. Chicago: University of Chicago Press (1962).

7. Popper K. Conjectures and Refutations: The Growth of Scientific Knowledge . London: Routledge Classics (1963).

8. Friedman M. Essays in Positive Economics . Chicago: Chicago University Press (1953).

9. Black F. Noise. J Finance (1986) 41 :529–43.

10. Anderson PW. More is different. Science (1972) 177 :393–6. doi: 10.1126/science.177.4047.393


11. Gu M, Weedbrook C, Perales A, Nielsen MA. More really is different. Phys D (2009) 238 :835–9. doi: 10.1016/j.physd.2008.12.016


12. Chaitin G. Proving Darwin: Making Biology Mathematical. New York: Pantheon Books (2012).

13. von Neumann J. The Computer and the Brain . New Haven; London: Yale University Press (1958).

14. Turing A. Computing machinery and intelligence. Mind (1950) 59 :433–60. doi: 10.1093/mind/LIX.236.433

15. Shiller RJ. Irrational Exuberance . 3rd ed. Princeton: Princeton University Press (2015).

16. Muth JF. Rational expectations and the theory of price movements. Econometrica (1961) 29 :315–35. doi: 10.2307/1909635

17. Mantel R. On the characterization of aggregate excess demand. J Econ Theory (1974) 7 :348–53. doi: 10.1016/0022-0531(74)90100-8

18. Smets F, Wouters R. An estimated stochastic dynamic general equilibrium model of the euro area. In: Working Paper No. 171, European Central Bank Working Paper Series. Frankfurt (2002). doi: 10.2139/ssrn.1691984

19. Lux T. Applications of statistical physics in finance and economics. In: Kiel Working Papers 1425. Kiel: Kiel Institute for the World Economy (2008).

20. Chakraborti A, Muni-Toke I, Patriarca M, Abergel F. Econophysics review: I. empirical facts. Quant Finance (2011) 11 :991–1012. doi: 10.1080/14697688.2010.539248

21. Gallegati M, Keen S, Lux T, Ormerod P. Worrying trends in econophysics. Phys A (2006) 370 :1–6 doi: 10.1016/j.physa.2006.04.029

22. Mandelbrot B. Stable Paretian random functions and the multiplicative variation of income. Econometrica (1961) 29 :517–43. doi: 10.2307/1911802

23. Gabaix X, Gopikrishnan P, Plerou V, Stanley HE. A theory of power-law distributions in financial market fluctuations. Nature (2003) 423 :267–70. doi: 10.1038/nature01624

24. Lux T, Marchesi M. Scaling and criticality in a stochastic multi-agent model of a financial market. Nature (1999) 397 :498–500.

25. Cont R, Bouchaud J-P. Herd behavior and aggregate fluctuations in financial markets. Macroecon Dyn. (2000) 4 :170–96. doi: 10.1017/s1365100500015029

26. Marchenko VA, Pastur LA. Distribution of eigenvalues for some sets of random matrices. Mat Sb (1967) 72 :507–36.

27. Plerou V, Gopikrishnan P, Rosenow B, Amaral LAN, Guhr T, Stanley HE. Random matrix approach to cross correlations in financial data. Phys Rev E (2002) 65 :066126-1—18. doi: 10.1103/physreve.65.066126

28. Burda Z, Jurkiewicz J, Nowak MA, Papp G, Zahed I. Levy matrices and financial covariances (2001). arXiv:cond-mat/0103108

29. Hall P, Soskice D. Varieties of Capitalism: The Institutional Foundations of Comparative Advantage . Oxford: Oxford University Press (2001). doi: 10.1093/0199247757.001.0001

30. Schumacher EF. Small is Beautiful: A Study of Economics as if People Mattered. London: Blond & Briggs Ltd. (1973). 352 p.

31. Beinhocker ED. The Origin of Wealth: The Radical Remaking of Economics and What It Means for Business and Society . Boston: Harvard Business Review Press (2007).

32. Georgescu-Roegen N. The Entropy Law and the Economic Process . Cambridge, MA: Harvard University Press (1971). doi: 10.4159/harvard.9780674281653

33. Minsky HP. Stabilizing an Unstable Economy . New Haven: Yale University Press (1986).

34. Cauwels P, Sornette D. The Illusion of the Perpetual Money Machine, Paper No. 12-40 . Geneva: Swiss Finance Institute Research (2012). Available online at: http://ssrn.com/abstract=2191509 (Accessed October 27, 2012).

Keywords: economics, empirical science, epistemology of economics, econophysics, complex systems, financial crises

Citation: Focardi SM (2015) Is economics an empirical science? If not, can it become one? Front. Appl. Math. Stat . 1:7. doi: 10.3389/fams.2015.00007

Received: 07 April 2015; Accepted: 29 June 2015; Published: 21 July 2015.


Copyright © 2015 Focardi. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Sergio M. Focardi, Department of Applied Mathematics and Statistics, Stony Brook University, Stony Brook, NY 11794, USA, [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.


A Comprehensive Empirical Research Guide for Academics


As the world advances, research becomes a critical component of every sector. Research is required to systematically explore topics, confirm facts, and draw conclusions. One such approach is empirical research.

Empirical research is a key pillar of scientific discovery, transforming ideas and questions into tangible and evidence-based insights. By gathering data through observation or experimentation, researchers can test hypotheses and uncover meaningful patterns in the real world.

Whether you are setting out to investigate human behavior, test a new medical treatment, or analyze market trends, empirical research allows you to draw reliable conclusions grounded in fact.

This guide walks you through the essential steps of designing and conducting a successful empirical study, helping you navigate the process with confidence and precision. Whether you are just starting out or looking to sharpen your research skills, you will find the essential information in this one resource. Before we go deeper, let us look at the definition of empirical research.

What is Empirical Research?

Empirical research is a method of acquiring knowledge through direct observation or experience, typically involving data collection, analysis, and interpretation. The research process involves the use of data to investigate a hypothesis or answer a research question.

The process requires direct or indirect observation and can be either quantitative (involving numerical data) or qualitative (focusing on non-numerical data such as interviews or observations). This research approach is often associated with the scientific method, but it is applicable to a wide range of disciplines.

Empirical research contrasts with theoretical research, which relies on models, opinions, or existing theories without the need for direct observation.

When conducting empirical research, researchers gather and study real, observable data, which makes the work objective. It is worth noting that empirical research is typically published in peer-reviewed articles.

Types of Methodologies Used in Empirical Research

Empirical research can take various forms depending on the type of data collected and analyzed. Below are the most common types:

1. Quantitative Research

Quantitative empirical research focuses on collecting numerical data. Researchers collect, measure, and analyze the numerical data statistically. Quantitative research often involves large sample sizes to identify patterns, relationships, and generalizations.

  • Examples: A study measuring the effect of a new drug on blood pressure levels among a group of patients. Another example is a study on the correlation between the time spent on social media and academic performance.
  • Subjects that use it mostly: Often used in fields like medicine, economics, sociology, and psychology.

Quantitative research excels at measuring behavior, patterns, personal views, preferences, and other similar variables. Studies that are based on quantitative methods are more structured and the variables used are predetermined. Researchers use the data collected and analyzed in quantitative studies to answer empirical questions.

Quantitative research methods include longitudinal study, correlational research, causal-comparative research, survey, and experiment.

2. Qualitative Research

Qualitative empirical research gathers non-numerical data, such as interviews, focus groups, or observational notes. Qualitative research mainly involves understanding experiences, opinions, and motivations to get insights into human experiences.

Qualitative research mostly involves a small group of people and conversational methods to gather insights into a problem.

  • Example : An ethnographic study examining the social behaviors of a remote tribe.
  • Subjects : Common in anthropology, education, sociology, and nursing.

3. Mixed Methods Research

There are instances when using a single research method does not suffice to address the research question. In such cases, a researcher can use a mix of both qualitative and quantitative research methods.

Mixed-methods research combines both quantitative and qualitative research techniques to gain a more comprehensive understanding of a subject. It allows researchers to triangulate data from different perspectives.

  • Example: A study on the effectiveness of a training program, combining surveys (quantitative) and interviews (qualitative).
  • Subjects: Used across various fields, including education, psychology, and healthcare.

Steps for Conducting Empirical Research

Whether you are conducting a quantitative study, a qualitative investigation, or using a mixed-methods approach, the steps outlined below offer a comprehensive guide to conducting empirical research effectively.

1. Identify the Research Problem, Question, or Hypothesis

The first and most crucial step in empirical research is identifying a clear, concise, and researchable problem, hypothesis, or question.

This step lays the foundation for the entire research process, guiding subsequent steps and determining the direction of the study.

A well-formulated research problem should be specific, relevant, and addressable through empirical methods such as observation, measurement, and experimentation.

If you are formulating a research question, it should address a significant gap in the existing literature or offer practical solutions to real-world problems. Avoid vague or overly broad questions. Narrowing down the focus ensures that you can conduct the research within reasonable time and resource constraints. Additionally, the problem should be realistic in terms of access to data, subjects, or other resources. Here is an example of narrowing a broad question into a specific problem:

  • Broad Question: "What is the effect of technology on education?"
  • Specific Problem: "How does the use of virtual learning environments impact student engagement in high school science classes?"

Another good example of a question-driven study is: “How does religiosity impact mental health in immigrant adults in the UK?”

In some cases, the first step in empirical research is the formulation of a hypothesis rather than a problem or research question. The choice depends on whether the researcher begins with a specific prediction or with an area of curiosity to explore. A hypothesis is a tentative, testable statement about the expected relationship between variables. For hypothesis-driven research, the researcher must have a clear idea or theory about the outcome; this is typical of experimental or quantitative research. An example of a hypothesis is: “Regular exercise decreases levels of anxiety in adults.”

It is worth noting that in hypothesis-driven research, the research question is often implicit, and the hypothesis serves as a more focused prediction to be tested.

  • How to develop a research question.
  • Steps for writing a problem statement section.

2. Literature Review and Theoretical Framework

A new empirical study does not exist in a vacuum. Instead, it builds on what other researchers have already done, on the further studies they have recommended, and on the questions their research failed to address. Therefore, the second step in an empirical inquiry is to read and study the existing research on your topic.

Before designing the study, it is essential to conduct a thorough literature review. This process involves examining existing research to understand what has already been studied in your area of interest. The literature review serves several purposes:

  • Identifying gaps in current knowledge.
  • Finding theoretical frameworks that can guide your study.
  • Helping refine your research question or hypothesis based on what is known.

The theoretical framework is the foundation that underpins your study. You can build the theoretical framework for your study from existing theories or models. The theoretical framework helps you explain how and why certain variables may relate to each other.

For example, if you are studying the effects of sleep deprivation on cognitive performance, your theoretical framework may be based on existing theories of cognitive load and performance degradation under fatigue.

3. Create a Hypothesis

Before commencing your research, you need a working hypothesis: an informed guess about what your probable result will be.

A hypothesis is a testable statement that predicts the outcome of your research. It specifies the expected relationship between the variables you are studying. In quantitative research, you frame the hypothesis in terms of cause-and-effect relationships, whereas in qualitative research, the hypothesis might be more open-ended, exploring trends or themes rather than causal links.

Types of Hypotheses:

  • Null Hypothesis (H0). Assumes that there is no relationship between the variables. For example: “There is no significant difference in cognitive performance between individuals who sleep 8 hours a night and those who sleep 4 hours.”
  • Alternative Hypothesis (H1). Suggests that there is a relationship between the variables. For example: “Individuals who sleep 8 hours a night perform significantly better on cognitive tasks than those who sleep 4 hours.” A sample-size planning sketch follows this list.
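One practical step that follows directly from stating H0 and H1 is estimating the sample size needed to detect the hypothesized effect. The sketch below assumes the statsmodels library is available and uses an arbitrarily chosen medium effect size; both are illustrative assumptions, not part of this guide's required method.

```python
# Sample-size planning for an independent-samples t-test.
# The effect size (Cohen's d = 0.5, a "medium" effect) is a hypothetical choice.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,          # assumed standardized difference between groups
    alpha=0.05,               # significance level (Type I error rate)
    power=0.80,               # 1 - Type II error rate
    alternative="two-sided",
)
print(f"about {n_per_group:.0f} participants per group")  # roughly 64 here
```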

4. Research Design and Methodology

Once you have a clear hypothesis, the next step is to design the research. The research design outlines how you will collect and analyze data.

The design of your research directly influences the validity and reliability of your findings.

Depending on the type of research, you will need to select the appropriate research method:

  • Quantitative Research. Often uses experimental or correlational designs. Data is collected through surveys, tests, or observations and analyzed using statistical methods.
  • Qualitative Research. Employs methods like interviews, focus groups, or participant observation. A researcher analyzes data thematically or through content analysis.
  • Mixed-Methods Research. Combines both quantitative and qualitative approaches to offer a more comprehensive understanding of the research problem.

When selecting a research strategy, it is important to consider several factors, including:

  • Sample Size . In quantitative research, a larger sample size generally leads to results that are more reliable. In qualitative research, smaller, more focused samples are often used.
  • Control and Experimental Groups . If conducting experimental research, you may need a control group (which does not receive the intervention) and an experimental group (which does).
  • Ethics . Ensure that your research design considers ethical principles, particularly if working with human or animal subjects. The ethics committee or the Institutional Review Board (IRB) of an institution must approve research that involves human subjects.

For a study on sleep deprivation, you might use a between-subjects design in which one group sleeps 8 hours and another sleeps 4 hours, and then compare their cognitive performance through standardized tests; a minimal assignment sketch follows.
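Here is a small sketch of how such random assignment might be scripted; the participant IDs are hypothetical placeholders, and in practice you would load your actual recruitment roster.

```python
import random

# Randomly assign 20 hypothetical participants to two sleep conditions.
random.seed(1)  # fixed seed so the assignment is reproducible and auditable

participants = [f"P{i:02d}" for i in range(1, 21)]
random.shuffle(participants)

half = len(participants) // 2
groups = {
    "8_hours": participants[:half],   # well-rested condition
    "4_hours": participants[half:],   # sleep-deprived condition
}
for condition, members in groups.items():
    print(condition, members)
```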

5. Data Collection

After finalizing the research design, the next step is data collection. The method you select for data collection depends on whether your research is based on quantitative, qualitative, or mixed-methods. The goal during this phase is to gather data that will allow you to test your hypothesis or answer your research question.

Quantitative Data Collection Methods

  • Surveys and Questionnaires. Used to gather data from a large number of respondents. These are particularly useful for studies involving behavioral trends, attitudes, or opinions. With the advancement in technology, new survey tools have emerged, making the process seamless for researchers.
  • Experimentation. An experiment is a research method in which the researcher manipulates one or more independent variables under controlled conditions to observe their effect on a dependent variable, often with random assignment of participants to different groups (e.g., experimental and control groups). Experiments are widely used in scientific fields like psychology, medicine, and biology to test hypotheses and establish cause-and-effect relationships between variables.
  • Observational method. Direct observation of behavior or phenomena in real time, often used in the behavioral sciences and as part of ethnographic research designs. In quantitative studies, it is handy for recording directly observable, measurable variables such as posture, eye color, or age.
  • Causal-comparative research. This method is used to examine causal relationships among variables; for example, whether allowing employees to work from home affects productivity.
  • Cross-sectional research. An observational design used to gather and analyze data from a population at a single point in time, comparing subjects that are similar in all variables except the one under study. Researchers use it to measure the prevalence of health outcomes, describe a population, and understand determinants of health; it is also used in marketing and by pharmaceutical companies. Often, a cross-sectional study precedes a longitudinal study.
  • Longitudinal studies. The longitudinal method is useful for understanding the behavior or traits of subjects by testing them repeatedly over a certain period. It can involve collecting both qualitative and quantitative data. A common type is the cohort study.
  • Correlational research. A non-experimental method used to determine the relationship or association between two or more variables. Unlike experimental research, it does not manipulate variables but observes them in their natural state. The strength and direction of the relationship are measured using a correlation coefficient, typically ranging from -1 (a perfect negative correlation) to +1 (a perfect positive correlation), with 0 indicating no correlation; a short computational sketch follows this list. Correlational research is useful for identifying trends and patterns but cannot establish causality. It is commonly used in psychology, education, and the social sciences to explore relationships between variables like behavior, attitudes, and outcomes.
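As a quick illustration of the correlation coefficient described above, here is a minimal sketch using made-up numbers for hours spent on social media versus exam scores; it assumes SciPy is installed.

```python
# Pearson correlation on invented data (illustration only, not real findings).
from scipy import stats

hours  = [1, 2, 2, 3, 4, 5, 6, 7]          # hours on social media per day
scores = [88, 85, 90, 80, 78, 72, 70, 65]  # exam scores

r, p_value = stats.pearsonr(hours, scores)
print(f"r = {r:.2f}, p = {p_value:.4f}")   # r close to -1: strong negative association
# Remember: correlation does not establish causation.
```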

Qualitative Data Collection Methods

  • Interview. One-on-one or group interviews gather detailed insights into people’s experiences or opinions. The interview is a conversational approach for collecting in-depth data about a phenomenon and is common in the social sciences and humanities.
  • Case Study. An in-depth analysis of existing cases that allows researchers to obtain empirical evidence for investigating businesses or real-world problems. The researcher must verify that the parameters and variables of the analyzed cases are comparable to the case at hand; conclusions can then be drawn to answer the research questions. Case studies are useful for studying the experiences of groups, organizations, and geographic areas.
  • Focus Groups. A moderated discussion with a group of individuals to explore attitudes or perceptions.
  • Participant Observation. The researcher immerses himself or herself in a group or community to observe behaviors, interactions, and cultural practices.
  • Textual analysis. A secondary data collection approach that entails describing, interpreting, and understanding textual content. Textual analysis helps identify patterns and trends in media content, and the resulting data can be used to determine, for example, customer preferences, buying habits, and dislikes. It is useful for campaign design by marketers and for product design by product teams.

6. Data Analysis

Once the data is collected, the next step is to analyze it. The methods of analysis will depend on whether the data is quantitative or qualitative.

If you collected quantitative data, here are some quantitative data analysis methods to consider:

  • Descriptive Statistics. Used to summarize the data (e.g., mean, median, standard deviation); see the short sketch after this list.
  • Inferential Statistics. Techniques like regression analysis, t-tests, or ANOVA are used to determine whether the observed effects are statistically significant.
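As a small sketch of the descriptive side, using Python's standard library and invented test scores:

```python
# Descriptive statistics with the standard library (illustrative data only).
import statistics

scores = [72, 85, 78, 90, 66, 88, 75, 81]

print("mean:  ", statistics.mean(scores))
print("median:", statistics.median(scores))
print("stdev: ", round(statistics.stdev(scores), 2))  # sample standard deviation
```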

If your study was qualitative, consider the following qualitative data analysis approaches:

  • Thematic Analysis. Involves identifying, analyzing, and reporting themes or patterns within the data.
  • Content Analysis. Focuses on interpreting the content of textual or visual data; a simplified counting sketch follows this list.
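The sketch below is a deliberately simplified, quantitative stand-in for content analysis: it merely counts candidate theme words in toy transcript snippets. Real thematic or content analysis requires careful human coding (or dedicated qualitative software), so treat this only as an illustration of the counting step.

```python
# Count candidate theme words in toy interview snippets (illustration only).
from collections import Counter
import re

transcripts = [
    "I felt frustrated and could not concentrate after the short night.",
    "Mostly tired, a bit frustrated, and it was hard to focus all afternoon.",
]

words = Counter()
for text in transcripts:
    words.update(re.findall(r"[a-z']+", text.lower()))

for theme_word in ["frustrated", "tired", "concentrate", "focus"]:
    print(theme_word, words[theme_word])
```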

For example, in a sleep deprivation study, quantitative analysis might involve comparing test scores between the two groups using a t-test, while qualitative analysis could involve identifying themes from interview transcripts regarding participants’ emotional states after sleep loss.
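A minimal sketch of that quantitative comparison, assuming SciPy is available and using invented scores:

```python
# Independent-samples t-test for the hypothetical sleep study (invented data).
from scipy import stats

well_rested    = [85, 88, 90, 79, 92, 84, 87]  # 8 hours of sleep
sleep_deprived = [70, 75, 68, 80, 72, 66, 74]  # 4 hours of sleep

t_stat, p_value = stats.ttest_ind(well_rested, sleep_deprived)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# If p < 0.05, reject H0 and conclude the group means differ.
```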

7. Interpret Results and Test Hypothesis

After analyzing the data, interpret the results to determine whether they support or refute your hypothesis. In quantitative research, this typically involves looking at the statistical significance of the results. In qualitative research, it involves understanding the broader meanings or implications of the themes identified.

Let us look at a few examples:

  • If the t-test reveals that there is a significant difference in cognitive performance between sleep-deprived and well-rested individuals, you can conclude that sleep deprivation negatively affects cognitive performance.
  • If interviews reveal recurring themes of frustration and difficulty concentrating among sleep-deprived individuals, these findings provide valuable context for understanding how sleep loss impacts daily life.

8. Report your Findings

Once the research is complete, the final step is to report your findings. This involves writing up the study in a research paper or report, typically following a standard structure:

Components of a Scientific Research Paper:

  • Introduction. Introduces the research problem, background, and hypothesis.
  • Methods. Describes how the study was conducted, including the research design, participants, and data collection methods.
  • Results. Presents the findings of the study, often with tables, charts, or graphs in quantitative research.
  • Discussion. Interprets the results, discussing their implications and how they relate to existing research.
  • Conclusion. Summarizes the key findings and suggests directions for future research.

Read our guide on formatting a scientific paper .

For a study on sleep deprivation, the report might include statistical results showing the decline in cognitive performance and a discussion of how these findings could influence recommendations for work schedules, student study habits, or public health policies.

You can then disseminate your findings by presenting abstract posters at conferences, submitting a manuscript for peer review and publication in a relevant journal, or sharing your findings and knowledge at seminars.

Related Readings:

  • Number of references for a research paper.
  • How to write a nursing research paper.
  • Setting the cover page for a research paper.
  • How to write a research paper.

Pros and Cons of Empirical Research

Empirical research is objective and thus valuable in many fields. However, it also has disadvantages.

Here are the advantages of empirical research:

  • Objective and reliable. Empirical research is based on actual data, which makes it more reliable than other forms of research based solely on theory or speculation. Because it is grounded in observation, it is also less prone to biases. Empirical research is useful in validating previous research findings, theories, and frameworks.
  • High level of control. Empirical research offers a high level of control, allowing researchers to manipulate independent variables while keeping others constant. This helps in accurately identifying cause-and-effect relationships and minimizing confounding variables that could skew results. Controlled settings, such as laboratories, provide the opportunity for precise measurement and replication, ensuring reliable and valid outcomes. Random assignment in experimental designs further reduces bias by ensuring comparable groups. This control enhances the internal validity of studies, making it easier to attribute observed effects to the variables being tested, and leading to more accurate and credible research findings.
  • Replicable. Empirical studies can often be replicated by other researchers, which enhances the validity of the results. For instance, a study on a drug's effects can be replicated in different populations to confirm its generalizability.
  • Real-world applications. Empirical research often has practical applications because it focuses on real-world data. For example, it can be used to develop policies, improve business practices, or guide clinical interventions.
  • Data-driven. Because the research is based on actual data, it provides actionable insights. In contrast to purely theoretical models, empirical research can offer tangible results, particularly useful in applied fields like healthcare and business.

Despite having many advantages, empirical research is never perfect. Here are some of its drawbacks:

  • Time-consuming. Collecting and analyzing empirical data can be time-intensive. Fieldwork, experiments, or extensive surveys may take weeks or months to complete.
  • Costly. Empirical research often requires significant financial resources, particularly if it involves specialized equipment, long-term studies, or large sample sizes.
  • Ethical issues. In certain types of empirical research, particularly in social and medical fields, there may be ethical concerns. For example, experiments involving human subjects need to follow strict ethical guidelines.
  • Limitations in Scope. Empirical research is often limited to specific settings, populations, or contexts, which can make it hard to generalize findings to broader scenarios.

Tips for Successful Empirical Research

Any researcher who sets out on a scientific inquiry hopes to get to the bottom of it. Following the steps above will take you there, and the tips below will make your empirical research journey even more robust.

  • Choose the right research method. Picking the appropriate research method (quantitative, qualitative, or mixed) is crucial. Quantitative methods are best for testing hypotheses with numerical data, while qualitative methods are ideal for exploring complex, contextual questions.
  • Do pilot testing. Before fully committing to a research project, it is advisable to run a pilot test. This smaller-scale trial helps identify any potential issues in data collection methods or instrumentation.
  • Use reliable tools. Ensure that your data collection tools (surveys, tests, observation sheets) are reliable and valid. Using established tools or validating new tools is essential for the credibility of your research.
  • Maintain objectivity. One of the key strengths of empirical research is its objectivity, so it is essential to minimize biases. Standardizing procedures and having clear, predefined criteria for analysis can help maintain objectivity.
  • Be strict about ethics. Obtain informed consent, maintain confidentiality, and minimize harm to subjects. Adhering to ethical standards is crucial in empirical research, particularly in studies involving human or animal subjects.
  • Consider collaboration and peer review. Collaborating with other researchers can offer different perspectives and reduce the risk of biases. Submitting your research for peer review also ensures that other researchers scrutinize the methodology and findings, thereby strengthening the study’s credibility.
  • Be clear and concise. When reporting empirical research, clarity is key. Use straightforward language and avoid jargon to ensure that your findings are accessible to a broader audience.
  • Provide context. Always provide context for your findings. For instance, if you are researching customer satisfaction in a specific industry, explain why this topic is important and how your research contributes to existing knowledge.

Tips for Writing a Great Empirical Paper

Empirical writing is the capstone of empirical research: after the research itself, researchers disseminate their findings in reports, research papers, or articles. Empirical writing follows a specific format, with each section playing a significant role. Here are the steps and tips:

  • Give a background of your study. Introduce your research by providing the background of the problem. Introduce the research question and explain why it is significant, then state the purpose and objectives of your study and what you set out to find or prove. Given that this information is reader-facing, avoid scientific jargon and use simple language.
  • Have a specific literature review. In this part, refine your literature review. Report the gaps, contradictions, and areas needing further exploration that you identified during the research. The literature review should provide context and justification for your research; set the stage for your study by limiting your choices to studies that explain why your research is important.
  • Clearly explain the methods. Since you are reporting, describe how you did the research. Explain in detail the rationale for choosing one method over another, supporting your choice with evidence from published literature. Present your research plan, including the inclusion and exclusion criteria, sampling strategies, and every detail of the methods; doing so improves the credibility of your study. Note any challenges you faced when collecting or analyzing the data. Be specific enough that another researcher could replicate your study.
  • Share the results. You should share your empirical findings in a simple format. Present the findings objectively, without interpretation. Use tables, charts, or graphs and other visuals to illustrate the data.
  • Expound on the findings. This is the part where you discuss your research results. Interpret the results to draw meanings and conclusions, explaining their significance in relation to the hypothesis and research question. You should also discuss the limitations of the study and suggest areas for future research. If you encountered challenges, do not be shy about acknowledging them in this section.
  • Wind up the paper. Finally, conclude your empirical paper with a summary of your findings, their context, and their significance. Remind your readers why the study matters and highlight its broader implications. Ensure that you tie your findings back to the research question or hypothesis.
  • Edit and proofread. Editing and proofreading bring your paper as close to perfect as possible. Review the paper to ensure that you have followed the correct structure and that all sources cited in the paper appear in the references section. Follow the required citation style (e.g., APA, Harvard, MLA, Chicago, AMA, Turabian, ASA, IEEE). Review the paper for clarity, consistency, and adherence to academic writing standards and style, and finally check for grammatical errors and the overall flow of ideas.

Examples of Use Cases for Empirical Research

Empirical research plays a vital role in diverse fields, offering a reliable means to test hypotheses, validate theories, and gain insights from real-world data. For instance, it helps in investigating and improving current theories, developing new theories, and growing knowledge across different areas.

In addition, since it focuses on objectivity, the research findings are reliable, making it a go-to approach to scientific inquiry in fields such as economics, public policy, psychology, sociology, nursing, and medicine.

Here are some examples:

  • Medicine. Clinical trials, such as testing a new drug’s efficacy, are quintessential examples of empirical research in medicine. The data collected from these trials directly inform medical practice and policy.
  • Psychology. Experiments that measure human behavior, like studies on memory retention or stress response, often involve empirical methods.
  • Economics. Economists use empirical research to test models on market behavior, consumer trends, or the impact of government policies.
  • Education. Empirical studies in education might involve classroom observations or surveys to assess the effectiveness of teaching methods.
  • Environmental Science. Empirical research in this field often includes observing natural phenomena, conducting fieldwork, and testing hypotheses related to climate change, biodiversity, or conservation efforts.

Related Articles:

We have published, in our blog, other articles that could interest you. Please take time to read them:

  • What is a PhD Concept Paper?
  • Tools to help you manage citations and references.
  • Formulating a theoretical framework.
  • How to write a great informational report.
  • Research paper outline example.

Empirical research serves as a cornerstone of knowledge across various fields, providing a systematic way to observe, collect, and analyze data. Its ability to offer objective, real-world insights makes it indispensable in science, medicine, economics, humanities, and social sciences.

While it comes with challenges like time and cost constraints, the advantages of data-driven, replicable findings far outweigh the downsides.

For students who want to work with empirical research articles, carefully evaluate the methods and results sections of an article: empirical articles include these sections and explicitly state their methodologies and results. Meta-analyses, literature reviews, editorials, letters, book reviews, opinion pieces, and case studies are not empirical. You can use database controls, such as filtering for evidence-based practice articles, to find empirical research papers. Alternatively, search with keywords such as empirical research, quantitative method, qualitative method, survey, ethnography, or fieldwork.

We are a research paper writing service with professional writers in every field to help you write your papers. If you need help, do not hesitate to place your order and get a plagiarism-free paper. We do NOT use AI when writing the papers on our website.


  • Conducting empirical research - Emerald Publishing.
  • Empirical Research in the Social Sciences and Education.
  • Empirical Research: Defining, Identifying, and Finding.
  • Empirical vs. Non-Empirical Research.
  • How to write empirical research papers – James Madison University.
  • Identifying empirical research.
  • Quantitative and Empirical Research vs. Other Types of Research.
  • Writing an Empirical Paper in APA Style.



paypal logo

person with large bass drum instrument on their back

Professor Card was awarded the 2021 Nobel Prize for his empirical contributions to labor economics

what is empirical research economics

​Congratulations to the Class of 1950 Professor of Economics David Card, the 6th Nobel for Berkeley Economics. Professor Card, who shares the award with two other winners, was awarded the 2021 Nobel Prize for his empirical contributions to labor economics. Read more
