Like many of us, I am both amazed and cautious about the advancements in AI technology. Recently, I have started using AI tools to generate research ideas, explore new areas of study, and find relevant literature. One tool that caught my attention is Elicit, an AI research assistant developed to automate parts of researchers’ workflows, particularly literature reviews. Today, let’s delve into the features and functions of Elicit and see how it can revolutionize the way we conduct literature reviews.
What is Elicit?
Elicit is a research assistant powered by large language models such as GPT-3. It aims to make researchers’ lives easier by automating various aspects of their workflow. Currently, Elicit’s primary function is to assist with literature reviews. When researchers ask a question, Elicit presents relevant papers along with key information summaries in a user-friendly table. According to one of its developers, Elicit saves researchers an average of 1.4 hours per week.
What Can Elicit Do?
Elicit offers several valuable features to assist researchers:
- Quickly Locate Papers: Elicit helps researchers find papers on specific research topics swiftly.
- Analyze and Organize Papers: Researchers can analyze and organize multiple papers, including their own PDFs, using Elicit.
- Summarize Key Evidence: Elicit summarizes evidence from highly cited papers on a research topic.
- Brainstorm Research Questions: Elicit aids researchers in exploring and brainstorming research questions.
- Identify Search Terms: Researchers can rely on Elicit to identify relevant search terms for their literature reviews.
- Define Terms: Elicit provides definitions for terms related to the research topic.
- Refine Research Direction: Elicit helps researchers narrow down or adjust their research direction for optimal results.
My Experience Using Elicit
I recently used Elicit to develop a literature review for a study on the use of ChatGPT-4 in academic writing development. Here’s a detailed account of my experience using Elicit’s basic features and functions to identify relevant literature.
Features and Functions
Upon opening Elicit, I was prompted to enter a research question. Alternatively, I could have chosen to run Elicit over my own collection of papers, but since I didn’t have any papers at the time, I skipped this step. My question was, “What is the utility of ChatGPT for academic writing development?”
Elicit swiftly directed me to a page displaying a table with two main columns: paper title and abstract summary. Each row also included details such as the authors, journal/source, year published, number of citations, and a DOI link. Some journals even had an impact factor graph, accessible by clicking on the bar graph icon. This feature is particularly useful for new researchers unfamiliar with reputable journals.
On the left-hand side of the screen, Elicit provided a “Summary of the top 4 papers” in beta testing mode. Below the summary, there was an option to add additional columns, allowing me to customize the table with information like intervention, outcomes measured, number of participants, and detailed study design. Other optional columns included metadata, population studied, intervention studied, result, and methodology.
Additional features were available on the top right-hand side of the screen:
- Has PDF: A toggle to filter out papers without full-text PDFs. Although I prefer viewing all papers, I applied this filter so I could compare the information provided by Elicit with the full-text papers.
- Filter: This feature allows users to filter papers by keywords appearing in the abstract, by publication date, or by study type, such as RCT, Review, Systematic Review, Meta-Analysis, or Longitudinal. I entered the keyword “ChatGPT” to include only papers mentioning it in the abstract.
- Sort by: Papers can be sorted by title, abstract summary, PDF availability, year, or number of citations, and the search results can be exported as a CSV or BibTeX file (a short post-processing sketch follows this list). I sorted the papers by citations to start with the most cited ones.
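Because the results can be exported, you can also reproduce the keyword filtering and citation sorting offline. Here is a minimal Python sketch using pandas; the file name elicit_export.csv and the column names (Title, Abstract, Year, Citations) are my assumptions about the export format, so check them against the header row of your own file and adjust as needed.

```python
# Minimal sketch: re-applying an abstract-keyword filter and a citation sort
# to an Elicit CSV export. The column names below are assumptions; verify them
# against the header row of your own export.
import pandas as pd

def shortlist(csv_path: str, keyword: str = "ChatGPT", min_year: int = 2022) -> pd.DataFrame:
    df = pd.read_csv(csv_path)

    # Keep papers whose abstract mentions the keyword (case-insensitive),
    # mirroring the in-app "Filter" by abstract keyword.
    has_keyword = df["Abstract"].fillna("").str.contains(keyword, case=False)

    # Restrict to recent papers and sort by citation count, most cited first,
    # mirroring the in-app "Sort by" option.
    recent = df[has_keyword & (df["Year"] >= min_year)]
    return recent.sort_values("Citations", ascending=False)

if __name__ == "__main__":
    top = shortlist("elicit_export.csv")  # hypothetical file name
    print(top[["Title", "Year", "Citations"]].head(10))
```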
Search Results
Although the number of returned papers was not immediately clear, the “top 4 papers” summary automatically updated once I applied the filters. The summary provided a glimpse into the papers, mentioning mixed findings on the utility of ChatGPT for academic writing development. Some papers suggested that ChatGPT helps with search and discovery, reference and information services, cataloging and metadata generation, content creation, and accelerating research article drafting. However, there were also arguments highlighting the limitations of ChatGPT in scientific and academic contexts, as well as discussions on cybersecurity when using ChatGPT for medical information.
While I couldn’t confirm the accuracy of the summary without reviewing the articles myself, if it is accurate, this is an incredibly useful, time-saving feature.
Clicking on a paper allowed me to view detailed information in a pop-up window, including an abstract summary, what the authors tested, outcomes measured, participants, trustworthiness, possible critiques, and other citations. Each section had clickable links, allowing me to compare the provided information with the paper itself. I noticed that some tables and excerpts in the paper were not visible in the Elicit preview. This brought to my attention the need to double-check the accuracy of the information provided by Elicit.
Key Takeaways
Based on my experience, here are a few key takeaways from using Elicit for literature reviews:
- Elicit expedites the literature search process by swiftly identifying and summarizing relevant papers, even when they don’t precisely match entered keywords.
- It can search across multiple sources, including scholarly journals and conference proceedings, providing a comprehensive research overview.
- Elicit streamlines paper organization by allowing customization of the table with desired information, making it easier to manage and track relevant papers during the writing process.
- It produces summaries of the most-cited papers on a particular research question, providing a quick overview of key findings.
- However, it is essential to keep a few limitations in mind. The exact number of papers returned by a search is not reported, so it is hard to tell whether relevant sources have been missed. Elicit also cannot access full-text papers behind paywalls, which limits the available sources. Finally, the information Elicit provides should be double-checked for accuracy, as misinterpretations can occur.
Undoubtedly, Elicit shows great promise as a valuable tool for literature searching. However, it is recommended to use it alongside other reliable literature databases to ensure a comprehensive search. My experience with Elicit has been positive, and I’m excited to explore its potential further in future research.
Stay tuned for more insights into the use of Elicit and other AI tools to enhance your research and writing experience.