Empirical Librarians Conference
The following sessions were accepted for presentation at Empirical Librarians 2019. This list includes both lightning and concurrent sessions.
Concurrent Sessions Abstracts
Track 1: “Supporting patrons’ original research”
Tweets Or It Didn’t Happen: Supporting Scholarship in the Age of Twitter
Christine Anne George, Benjamin N. Cardozo School of Law
If an article is written and there’s no mention of it on Twitter, does it even exist? Who should be tweeting about faculty scholarship? (Hint: it shouldn’t just be institutional accounts.) This session will address the importance (and challenges) of using Twitter to draw people to scholarship, from handling technical difficulties to adapting thoughts to 280-character snippets to navigating a flame war. For academics who are accustomed to traditional publishing with footnotes and editing, Twitter is a culture shock. However, there are different levels of engagement on Twitter, and there is no one set way for academics to use the platform. In this session I’ll discuss my outreach strategy to my faculty, the trainings I organized, and the progress we’ve made.
Counting Beans (or Citations): Assessing Project Impact Through Citation Collections
Hannah Blanco, Oak Ridge National Laboratory
The NASA-funded Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) curates and archives data products generated by NASA’s Earth Science Missions. As part of data discovery and project assessment, ORNL DAAC staff have been collecting citations of articles that use data from the archive, accomplished by monitoring the Digital Object Identifier associated with each dataset. This is a time-consuming process that requires the dedicated attention of a staff member for a large portion of their time; however, it remains vital, as the collected citations are used as a metric for data usage and for the impact the data center has on science. In previous years this citation collection process has taken a month or longer to locate, validate, and upload the citations into the ORNL DAAC’s database. As researchers are educated about the importance of citing datasets as well as additional resources and literature, the number of citations increases, and so does the amount of time the process takes. An attempt at automating this process was recently undertaken to reduce the time staff need to dedicate to citation collection. Reported here are the challenges, opportunities, and conclusions that emerged from that experience.
Collaborating with Researchers to Support Systematic Reviews
John Cyrus and Roy Brown, Virginia Commonwealth University
As the volume of published and unpublished research continues to grow across disciplines, evidence synthesis plays an important role in how academics, practitioners, and professionals take in new information and apply it to their work. Systematic reviews, a form of evidence synthesis most associated with the health sciences, provide a rigorous methodological framework under which to conduct evidence synthesis projects. The systematic review methodology has been adopted by numerous fields over the past decade, and librarians are increasingly being asked to provide support for what can be time-consuming and complex projects. This session will provide an overview of the systematic review methodology, including the process, the search, and data management, using examples from a health sciences systematic review program that has produced more than 15 published systematic reviews as well as other bodies of literature. In addition, the session will examine the process of collaborating with student and faculty researchers throughout the project.
Track 2: “Performing research in libraries”
Determining Value Using COUNTER Reports and Correcting for Trivial Use
Alfred Andrew Fry, Villanova University
Preliminary abstract:
I have been gathering usage data for electronic resources for more than 10 years, focusing on journals and journal packages. I believe that the best measure of use for a journal is the annual total number of successful full-text article requests. This can be obtained from many journal publishers using COUNTER JR1 reports. I calculate what I call cost per use by using the price we pay for each journal or package and dividing it by the number of successful requests. I believe this is a common practice. This data can be used to answer many questions. For example, is one subject area consistently more or less expensive than another? I am trying to collect data in a way that maximizes the ability to answer questions including unforeseen questions.
My ultimate goal is to determine an acceptable price range for all resources. With enough data, I can determine an acceptable range for cost per use which may vary by subject area or other factors. Using the acceptable cost per use range and the historical use, I can determine the acceptable price range and compare it to the actual price. Ultimately, this data would be presented in a visual format that would make sense to someone who does not understand the underlying calculations.
However, there is a significant problem with using the number of successful full-text article requests to measure use. This number includes what I call trivial use. For example, a student could open the full-text because the title or abstract looked interesting, skim the article, and determine that the article has no value. There is no way to distinguish this from what I consider real use. Including trivial use in the calculations results in cost per use that is lower than the true cost per use. This would lead the library to believe that it is getting a good value when in fact it is overpaying.
I am currently looking at patterns in the data to see if there is a mathematical way to “correct” the data to separate the signal from the noise.
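The core cost-per-use calculation described above can be sketched in a few lines; the journal names, prices, and JR1 request counts below are hypothetical examples, not data from the study.

```python
# Sketch of the cost-per-use calculation described above: annual price
# divided by successful full-text article requests (COUNTER JR1).
# Journal names, prices, and request counts are hypothetical.

def cost_per_use(annual_price, fulltext_requests):
    """Annual subscription price divided by successful full-text requests."""
    if fulltext_requests == 0:
        return float("inf")  # no recorded use at all
    return annual_price / fulltext_requests

journals = {
    # name: (annual price in USD, JR1 full-text requests for the year)
    "Journal A": (1200.00, 480),
    "Journal B": (3500.00, 70),
}

for name, (price, requests) in journals.items():
    print(f"{name}: ${cost_per_use(price, requests):.2f} per use")
```

Note that, as the abstract argues, the denominator includes trivial use, so these figures understate the true cost per meaningful use.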
Using Deductive Thematic Analysis to Examine Textual Documents
Joanna Thielen, Oakland University, and Amy Neeser, University of California – Berkeley
Within the library profession, content analyses of textual documents are often done using word frequency analysis, which counts the occurrences of specific words or phrases. While this method can be automated, it cannot fully account for the rich context within these documents. This presentation will discuss how to utilize a more rigorous methodology, deductive thematic analysis, to do content analyses. In deductive thematic analysis, variables are operationally defined with a codebook, with attributes listed for each variable, prior to analyzing the documents. The codebook is then used when reading the documents to note the presence of specific variables. Overall, this presentation will give audience members an introduction to this methodology, followed by a step-by-step primer of how to use it with any type of textual documents (student assignments, websites, syllabi, lesson plans, etc.), including creating a codebook, pilot testing, coding the documents, and analyzing the collected data. While this methodology requires a larger time investment than word frequency analysis, it can yield richer and more nuanced data from the textual documents.
To provide context on the application of this methodology, this presentation will provide examples from an ongoing research project: a content analysis of research data services professional job postings from 2013-2018. Analyses of job postings are often used to observe trends within the library profession. Specifically, this analysis of 183 job postings sought to determine trends in responsibilities and qualifications. The results present a picture of the landscape of research data services across the US over the past five years, including an analysis of job responsibilities and qualifications based on institutional Carnegie Classification. While not generalizable, these findings do offer strong indicators of common criteria for research data-focused positions in academic libraries.
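For contrast with the deductive thematic approach, the word-frequency baseline the abstract mentions can be sketched as follows; the sample postings are hypothetical, not drawn from the 183 analyzed.

```python
# Minimal sketch of the word-frequency analysis the abstract contrasts
# with deductive thematic analysis. Sample postings are hypothetical.
from collections import Counter
import re

postings = [
    "Provides research data management support and data curation services.",
    "Leads data management planning consultations for faculty researchers.",
]

counts = Counter()
for text in postings:
    # Lowercase and split on alphabetic runs, then tally occurrences.
    counts.update(re.findall(r"[a-z]+", text.lower()))

print(counts.most_common(3))
```

This counts surface terms only; it cannot tell whether “data management” names a responsibility or a qualification, which is exactly the contextual distinction a codebook-driven thematic analysis captures.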
‘They Use the Library, Don’t They?’: Investigating our Researchers’ Workflows through Qualitative Inquiry
Hilary Davis and Bertha Chang, North Carolina State University
In an effort to learn about researchers’ needs both met and unmet as well as library services valued by researchers, the NC State University Libraries conducted two qualitative studies: (1) a local study as part of the Ithaka S+R project to support civil and environmental engineering scholars, and (2) a project spanning multiple types of researchers and disciplines at NC State University. Librarians conducted face-to-face semi-structured interviews to capture deep qualitative data. The first study took a deep dive into a specific engineering discipline with a focus on faculty, while the second adopted a broader approach by including researchers from undergraduate to late career faculty across many disciplines.
In this presentation, we will share lessons learned and practical advice for librarians who are considering similar interview projects at their own institutions. We will discuss the benefits, challenges, and limitations of using interviews to learn about the workflows and needs of researchers. We will close with a brief summary of findings and actions we have taken from both studies.
Using Focus Groups to Improve Reference Services in a Liberal Arts College Library
Yiming Guo, Denison University
Denison University is a private liberal arts college in Granville, Ohio. Reference service is an intrinsic and important part of our library services, and providing quality reference services enormously promotes research and education in the Denison community. In order to critically evaluate the current reference services in the Denison University Library and improve them in the future, we conducted an informal focus group study during the spring semester of 2018 in a “pizza for your thought” format. Four focus groups with 4 to 5 students in each group were held, and invaluable student opinions, suggestions, and data were collected. Changes have been made to improve our reference services according to students’ feedback. This presentation will focus on the methodology of this informal focus group study. Experiences, lessons, and tips on issues such as IRB approval, question preparation, marketing and recruiting, timing and location, incentives, conversation moderating, note taking and recording, and result analysis will be discussed. We hope our findings, ideas, and research method can benefit other librarians and information professionals.
Import [Include/Exclude] Export: Using Rayyan for Database Comparison Study
Stephanie Ritchie, University of Maryland; Kelly Banyas, University of Scranton; and Carol Sevin, Kansas State University
Many methods of analyzing database content and performance have been used, and they share one common challenge: how to manage the large volume of data needed to accurately assess databases. As part of an ongoing project to compare databases that provide agricultural research literature, our research team collected data to analyze the retrieved content of eight research literature databases. We worked with a new, free application designed to assist teams with systematic reviews. Rayyan QCRI allows teams of researchers to include or exclude citations collected during research literature retrieval based on pre-set criteria. Our team repurposed Rayyan as a tool for reviewing search result citations for precision and recall analysis.
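The precision and recall measures the team computed from screened result sets can be sketched as below; the citation IDs and counts are hypothetical, not results from the eight-database comparison.

```python
# Hypothetical sketch of a precision/recall calculation for one database:
# compare the citations a search retrieved against the citations the
# screening team included as relevant.

def precision_recall(retrieved, relevant):
    """Precision: fraction of retrieved items that are relevant.
    Recall: fraction of relevant items that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Made-up citation identifiers for illustration only.
db_results = {"c1", "c2", "c3", "c4"}   # what the database search returned
included = {"c2", "c3", "c5"}           # what screeners marked as relevant
p, r = precision_recall(db_results, included)
print(f"precision={p:.2f}, recall={r:.2f}")
```

A tool like Rayyan supplies the include/exclude decisions that populate the `included` set; the arithmetic itself is this simple once those decisions are exported.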
Ethnographic Techniques for Library Research: Examining Student Use of Open and Secluded Library Spaces, and Why it Matters.
Susan Frey and Natalie Bulick, Indiana State University
Space design in academic libraries has changed considerably in recent decades with the increasing acquisition of electronic resources. Although previous studies have examined how students use single spaces, such as learning commons and open classrooms, this study compares student behaviors between open and secluded library spaces with both structured and more comfortable seating. Researchers at a mid-sized academic library devised an observation protocol and a schedule for observation over the length of a semester. The resulting data were coded and then analyzed using Edward T. Hall’s proxemics theoretical framework. This presentation will examine the qualitative methods employed, coding process, theory of proxemics as it applies to library space usage, and the extrapolated meaning of these data as applied to the theory of proxemics. The presenters will also discuss lessons learned and make recommendations for best practices. Attendees will leave the session with the tools and suggested resources needed to conduct their own ethnographic study.
How Did You Code “I’m Really Confident That I Can Find the Exact IKEA Pillow”?: Creating an Effective Codebook as a Team
Andrea Hebert, Louisiana State University
The subject librarian for Louisiana State University’s School of Library & Information Science designed a short survey that included open-ended questions to explore the self-efficacy of graduating MLIS students. The librarian asked a graduating MLIS student to participate in the study as a co-investigator. The survey results were imported into Dedoose, a web-based application, which allowed the co-investigators to collaborate at a distance for qualitative coding and analysis. Although teams of researchers often create the initial codebook collaboratively, the co-investigators in this study each created individual codebooks using inductive coding, which draws themes from data. After the co-investigators completed their individual codebooks, they discussed their descriptions and combined the two versions into a single codebook. They used this finalized codebook to begin the iterative process of applying codes to the data. In addition to discussing the project’s research design, the presentation will discuss the initial decision to create independent codebooks and its benefits to both the librarian and graduating student. The presentation will also touch on how working with a novice researcher offers advantages beyond the normal check on personal biases gained from working in teams. The presenter will also offer practical tips for teams of researchers working on qualitative coding, including tips for teams who work in different physical sites.
Site Unseen: Website Accessibility Testing for Academic Libraries with Visually-Impaired Users
Devon Waugh and Sarah Arnold, University of North Carolina – Chapel Hill
After an Office for Civil Rights accessibility complaint in early 2017, the University of North Carolina at Chapel Hill’s University Libraries has worked to improve the digital accessibility of the library website and online tools. At first we focused on the use of automated checkers like WebAIM’s WAVE and UIUC’s Functional Accessibility Evaluator to assess and fix issues. While automated tools can detect issues, such as missing alternative text, they only evaluate a quarter of the WCAG 2.1 requirements. Following these guidelines does not replace the experiences of actual users attempting to meet their information needs on the library website or databases.
Because of this, we decided to perform a usability test of the University Libraries’ database access page with users who are low vision or blind. We adapted our protocol for running a usability test to ensure our methods were inclusive for our targeted user group. We also worked with campus partners, like the Accessibility & Resource Services, to draw from their experience working with low vision or blind students, faculty, and staff.
This session will describe how we adapted our methods and carried out usability testing with this population including what we learned along the way. In particular, we will discuss how we adapted the traditional think-aloud protocol to account for the use of assistive technology. We will also discuss our recruitment methods, which included a Qualtrics survey.
The focus of this session is on our methodologies with an interest in making time for attendees to share how they approach web accessibility and serving users with disabilities on their respective campuses.
Chat Reference: Using Existing Data to Gather Information About Usage Patterns
Megan Wilson, Murray State University
When viewing virtual reference statistics, the focus has traditionally been on the number of patrons using the service or on the complexity of the questions asked in relation to staffing needs. However, this only scratches the surface of what reference statistics and chat transcripts can tell us about library users and their needs. To identify how libraries and librarians can better understand users’ needs in the rapidly evolving online environment, an analysis was designed to take advantage of existing reference statistics and transcripts to identify patterns, including where users are most likely to interact with virtual reference widgets and how this relates to the types of questions asked. This presentation will explore how the information was collected, organized, and analyzed, the challenges that arose, and the next steps in this research process.
3D Printing and Collection Development: Lessons Learned and Future Directions.
Dorothy Ogdon, University of Alabama at Birmingham
The University of Alabama at Birmingham provides a number of funding opportunities to support professional development for faculty. One such opportunity is the Faculty Development Grant Program, an initiative that “provides money for projects to enhance the effectiveness of UAB faculty members by providing funds for them to undertake new efforts for which resources are not otherwise available.” In 2017, Ms. Dorothy Ogdon, Emerging Technologies Librarian at the University of Alabama at Birmingham received funding from this program to purchase two LulzBot Taz 6 3D Printers and use them to create a collection of 3D-printed anatomical and molecular models that were added to the UAB Libraries circulating collections. By the close of the project reporting period, a total of 20 unique 3D printed objects (12 items) had been created and added to the library’s collections. The items have been loaned a total of 32 times, and new additions to the collection are planned in 2019. In this session Ms. Ogdon will present her experiences and lessons learned through participation in the program. Updates on the status of the objects in the 3D printed collection, as well as the current state of 3D Printing in Health Sciences Libraries will also be addressed.
Aren’t We All English Majors? A Practitioner’s Guide to Humanities Approaches for Library Science
Brett D. Currier, Federal Reserve Bank of Kansas City
In the Information Age, we as library and information science practitioners have access to a significant amount of textual information, which can help us support the communities we serve. In order to maximize use of this information, we as library practitioner researchers should rely on research methods from the humanities in order to answer questions about our communities. This presentation will discuss the humanities research method of content analysis and give examples of how to use those techniques to support research in library science.
In this presentation we will discuss the following questions:
- How do we create a research question which appropriately uses content analysis as a methodology?
- How do we identify an appropriate sample?
- How do we enlarge the sample?
- When do we know that we have enough information?
Lightning Talks
Track 1: “Supporting patrons’ original research”
Supporting Research Projects @ UVA with Data Visualizations and Geographic Information Systems
Erich Purpur, University of Virginia
Staff in the Research Data Services group at the University of Virginia’s Brown Science & Engineering Library actively support students, staff, and faculty in ongoing research efforts. The presenter will discuss recent projects in which the presenter collaborated with various students, faculty, and staff to create data visualizations, specifically using GIS software and Python scripting. These data visualizations play an important role in the research process and are used as supplementary materials for a publication or presentation, as deliverables to stakeholders, or even to provide unique insights into the process that were previously not available.
RDM in the Archives?! Filling a Critical Gap for Graduate Students Performing Special Collections Research
Elena Feinstein, Duke University
This presentation describes aspects of an innovative half-day workshop for graduate students planning to perform research in special collections and archives, providing foundational knowledge and attempting to fill critical gaps in their readiness. Workshop sessions included practical information about planning visits and handling materials, discussion of ethical and legal issues, and, the focus of this talk, research data management principles. Labeled as “digital information management” during the workshop and illustrated with relevant examples, these fundamental RDM concepts are a boon to special collections researchers generating many digital files, and are rarely taught in humanities courses or extra-curricular programs. This brief talk will share lessons learned from this broadly collaborative effort.
Reaching the Basic Sciences: A Scientist Turned Librarian’s Perspective
Stacey Wahl, Virginia Commonwealth University
As a new librarian for the basic sciences, I have found that building relationships with faculty, postdoctoral fellows, and students is imperative to success. Virginia Commonwealth University Libraries has a strong history of outreach and provides support to most schools, including the School of Medicine, through a liaison model. The basic sciences departments within the School of Medicine have proved a more challenging nut to crack, due to preconceptions about the abilities of library staff, the availability of online resources, and a silo culture. While significant inroads have been made in outreach into the postdoctoral community and into classroom instruction for literature search methods, there is still a prevailing sense among faculty that the library has limited services of value beyond access to resources. My previous roles as a scientist have given me insight into basic science culture; I have used this insight to build strategies to engage with stakeholders and combat that sense. This lightning talk will discuss successful, and not so successful, strategies for increasing awareness of services, marketing programming, and soliciting patrons for one-on-one consultations.
Wrangling Citation Managers. Is it Time for a Literacy-Based Approach?
Dorothy Ogdon, University of Alabama at Birmingham
Providing instruction designed to support student and faculty use of citation management software such as EndNote, Zotero, RefWorks, and Mendeley has become a widespread practice in academic librarianship. This type of instruction tends to focus largely on the use of a specific software package, often EndNote, rather than on developing citation-management-related literacies. This lightning session aims to begin a conversation on instructional activities related to citation management and the potential for shifting the focus of instruction from a single-software to a literacy-based approach.
Case Study: The Open Science Framework as a Data Curation Tool
Mara Sedlins, Colorado State University, and Tobin Magle, University of Wisconsin – Madison
Data sets submitted to institutional repositories vary greatly in their size and complexity, and library responsibilities for preserving data are increasing due to the prevalence of big data, multiple data types, and challenges with curation. For larger and more complex data sets, the process of curation takes time and involves ongoing collaboration between the curator(s) and researcher(s). One example of such a project is the Mongolian Rangelands and Resilience (MOR2) NSF project database, which contains ecological, geospatial, hydro-climate and social data pertaining to rangeland use in Mongolia. To document the curation process, the curation team used the Open Science Framework (OSF), a free cloud-based management system for research projects that has powerful collaboration and version control features. This facilitated communication among team members (including the onboarding of new library staff) and allowed the librarians involved in the project to build valuable expertise and experience with reproducible research practices and tools. Attendees will learn practical approaches for tracking the curation process for complex datasets with many contributing researchers.
Track 2: “Performing research in libraries”
Student Spaces and Technology: Critical Issues for Evaluation and Reconfiguration
Terry W. Brandsma, University of North Carolina – Greensboro
Recent collection rightsizing has expanded our footprint for individual and group study spaces. As we placed new furniture and technology in these spaces, we were guided by previous surveys and focus groups, and made certain assumptions about what we thought our users might want. As we further expanded these spaces, we needed to look critically at those assumptions, evaluate the use of these spaces, and determine whether the technology we provide aligns with what our users need. One quick study observed the use of library-provided technology in designated group study spaces. The impetus for this study and brief results will be presented.
What Are They Saying About Us? References to Library Support & Services in Grant Proposals
Jen Ferguson, Northeastern University
Librarians working in data support roles often provide assistance with data management plans for grant proposals. While this service is offered to benefit researchers, it also has the potential to ‘give back’ information to libraries.
This talk reports preliminary results from an analysis of 175 grant proposals. It describes the types of library support and services researchers most commonly referenced in their proposals, and discusses how this information is being used to target outreach to researchers. While the proposals studied originated from a single institution, the findings may help other libraries identify practical ways to enhance support for their own researchers.
Mapping the Role of Practitioners in LIS Scholarship
Amy Trost, University of Maryland
How can librarians measure the contributions of a single research group to a larger body of scholarship? One method is to use bibliometrics and network analysis. As bibliometric tools and techniques become more widespread and accessible, librarians can build on the literature-searching skills they already have to systematically measure and map the distribution of title, keyword, and abstract terms in a discipline of their choosing.
This talk will feature a case study that examines Library and Information Science (LIS) scholarship at the University of Maryland (UMD) Libraries and compares it to the broader body of scholarship around academic librarianship. Data were collected from three sources: Google Scholar, EBSCO’s Library and Information Science Source (LISS) database, and the Web of Science (WoS) core collection. Technologies used to access, analyze, and visualize the records included the tm and bibliometrix packages in R, VOSviewer, and Gephi. I will highlight major findings from the analysis and also provide some tips and tools for librarians who are interested in conducting their own bibliometric analyses.
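The case study used R’s tm and bibliometrix packages; a comparable term-distribution step can be sketched in Python as below. The records and stopword list are hypothetical, not data from the UMD analysis.

```python
# Comparable sketch (in Python rather than the R tm/bibliometrix stack)
# of tallying title/abstract terms across bibliographic records.
# The records and stopword list are hypothetical.
from collections import Counter
import re

records = [
    {"title": "Assessing library instruction",
     "abstract": "A study of instruction outcomes."},
    {"title": "Instruction and information literacy",
     "abstract": "Survey of literacy programs."},
]

stopwords = {"a", "of", "and"}
terms = Counter()
for rec in records:
    text = f"{rec['title']} {rec['abstract']}".lower()
    # Keep alphabetic tokens that are not stopwords, then tally.
    terms.update(w for w in re.findall(r"[a-z]+", text) if w not in stopwords)

print(terms.most_common(3))
```

The resulting term counts are what tools like VOSviewer and Gephi then map as co-occurrence networks.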
An Integrated Workflow of Multi-database Bibliometric Analysis
Sarah Jeong, Wake Forest University
Bibliometric analysis, which intrinsically depends on multi-database analysis, mainly focuses on publication trends and citation analysis to understand the characteristics and advances in a research area. The objective of this presentation is to share how to utilize multiple databases for bibliometric analysis of a selected research area. This presentation will demonstrate an integrated workflow of Web of Science and Scopus for our case study. Our work can help librarians at academic institutions to recognize bibliometric analysis as a valuable assessment tool for a research area.
Piloting a Librarian Writing Group
Kimberly Powell, Emory University
At many academic institutions, librarians are encouraged to make scholarly contributions, and at academic health sciences centers, the published journal article predominates as the form in which the library’s users most frequently define their own successes in this area. However, the journal article can also be one of the more daunting scholarly outputs for librarians to undertake in their own work. To increase the number of librarians regularly engaging in academic writing, a librarian writing group was piloted at one academic health sciences library. Fourteen librarians were invited to participate. Wendy Laura Belcher’s book “Writing Your Journal Article in Twelve Weeks” (ISBN 9781412957014) was selected as a guide, and all eight participants rotated duties leading discussions throughout the program. The bi-weekly meetings aimed to develop habits and approaches for engaging regularly in writing and to break down the common planning components of the journal manuscript. Surveys at the beginning and end of the pilot measured the impact of the writing group in helping participants meet their writing goals and the extent to which participation increased participants’ confidence in their ability to complete a manuscript for publication. At the conclusion of the program, all participants reported increased confidence, with four participants submitting a manuscript for publication. Administrative support from the Library Director and Associate Director, both also participants in the writing group, was found to be a key factor in the success of the group, as well as a major influence on the positive feedback from the remaining participants.
Rewriting the Never Ending Story Phase 1: Evaluating the Content Validity of EndNote Procedural Knowledge Assessment Questions
Megan Bell, University of Alabama at Birmingham
Objective: To assess the content validity of interactive, problem-based multiple choice and rank-order questions created to assess student procedural knowledge of EndNote.
Methods: The researcher created an assessment consisting of interactive, problem-based multiple choice and rank-order questions on specific EndNote tasks covered in routine one-hour EndNote sessions. A non-probability, purposive sample of EndNote instructors assessed the content validity of the questions; validity was assessed based on subjective feedback.
Results: Reviewers suggested altering the steps students perform, changing sentence structure, reducing redundancy, and changing punctuation, symbols, and terminology.
Conclusion: Assessing content validity is an important step that helps researchers become aware of issues peer reviewers would see as weaknesses in their assessment questions.
Exploring the Wellness Behavior of Librarians: Knowing and Doing
Susan Keller, Children’s National Health System, and Layla Heimlich, MedStar Health
Background
The investigators are health science librarians interested in exploring the connection between access to health information and the performance of healthy behaviors among librarians. We developed a survey designed to assess both quantitative and qualitative aspects of this question: “Is access to high quality health information among librarians associated with the performance of healthy behaviors?”
Methods
After scanning the research literature on wellness measures, we designed a brief survey to capture wellness behaviors ranging from exercise to spiritual practices.
In order to increase participation, we offered two incentives: 1) the possibility of receiving a token gift card (we received a grant from the Mid Atlantic Medical Library Association to fund these gift cards); and/or 2) receiving the results of the study. To maintain confidentiality, we included an optional separate link to collect information from those respondents who wanted either incentive. To maximize participation, we planned to send the survey out three times. Before submitting the survey to the Institutional Review Board, we sought advice from a statistician to make sure that the quantitative section of the survey would answer our research question, and we also consulted qualitative research experts to help us explore the “why” behind the responses. We used REDCap for the survey and sought responses from as many librarians as possible.
Results
Much to our surprise, we received 1905 responses and decided not to send the survey out multiple times. Of the respondents, 1600 wanted to receive the results of the research, a gift card, or both. The quantitative results show differences between the health behaviors of health science librarians and non-health science librarians in some key areas: health science librarians are more likely than non-health science librarians to engage in healthy eating and exercise behaviors, while non-health science librarians were more likely to engage in prayer or other spiritual practices. Overall, we found many similarities between the behaviors of these two populations of librarians. We are in the process of analyzing the qualitative responses.
Lessons learned
1) Ask your friends and colleagues for advice. We were fortunate to have advice from both a statistician and from qualitative experts. We also asked colleagues to “test drive” the survey and look for flaws.
2) Contact your Institutional Review Board (IRB) early in the research process. We learned that we could not call the gift card selection process a lottery. Since we have investigators from more than one institution, we needed to have one IRB designated as the IRB of record.
3) Be prepared for delays. Seemingly simple processes can be unexpectedly complex.
4) Try to think through all of the steps. Here’s a mistake we made: At the time we conducted the survey, our institution only made physical gift cards available as incentives. Unfortunately, we did not include a field to capture the street address for those respondents who were interested in possibly receiving a gift card. As a result, many more emails needed to be sent in order to capture the street addresses!
5) We gathered general demographic information but, in hindsight, we regret not including a geographic question. Different areas of the country show differences in the practice of wellness behaviors.
6) Research takes a lot of time. We are hospital librarians and one of us is a solo librarian. Finding the time to work on this project was difficult and support from our directors was essential.
7) Research is fun! We have learned a great deal during this process and we are looking forward to learning more as we complete the qualitative analysis of our data.