Meeting report: summary of Day 3 of the 2022 ISMPP Annual Meeting

The 18th Annual Meeting of the International Society for Medical Publication Professionals (ISMPP) took place in Washington, DC, on 9-11 May and was themed ‘Future-Ready Medical Communications’. This year, 595 attendees participated in the meeting, which was held in person for the first time since 2019.

The meeting covered a range of topics through thought-provoking and engaging keynote addresses, oral presentations, interactive sessions and roundtables. A summary of the third day of the meeting is provided below to benefit those who were unable to attend the meeting, and as a timely reminder of the key topics covered for those who did.

Read our summaries of Day 1 and the morning and afternoon sessions of Day 2 of the meeting.

Summaries of Day 3

Good Publication Practice 4 (GPP4): Major changes, lessons learned, and next steps


KEY TAKEAWAYS
  • GPP4 has been restructured to aid day-to-day usability and training for new members of the profession.
  • GPP4 will include key updates with respect to authorship, working with patients, preprints, enhanced content, and social media.

The first session on the final day of the meeting was an eagerly anticipated discussion about the development of GPP4, moderated by Lisa DeTora (Hofstra University) with panellists Faith DiBiasi (Otsuka), Laura Dormer (Future Science Group), and Eline Hanekamp (Excerpta Medica). All the speakers were GPP4 Steering Committee members.

DeTora noted that the GPP4 manuscript was recently submitted for journal review; as such, the panellists were able to provide an overview of the document during this session. Reflecting on the GPP4 development process, DeTora noted that while negative attention surrounding the profession during the preparation of GPP3 guided its development, GPP4 has been written in a much more positive environment, as demonstrated by the coverage of ISMPP and the profession in the recent Advancements with Ted Danson segment. The profession is also growing year on year, with new people entering the industry and the scope of publication types expanding.

DeTora then explained some changes to the structure of GPP4 compared with GPP3, noting that GPP4 aims to provide a more detailed introduction to the field, both to make processes more understandable for the stakeholders we work with and to provide a platform for training new entrants to the profession. To help with the day-to-day usability of GPP4, the structure is more unified across the document, with more accessible and digestible sections for different topics and stylistic changes to improve consistency and remove redundancies.

A significant proportion of the audience had participated in reviewing GPP4, with 122 sets of comments received from ISMPP member groups and 38 sets of invited expert reviews. DeTora thanked reviewers for their valuable comments, many of which highlighted areas that required clarification, and all of which the GPP4 Steering Committee aimed to address. GPP4 aims to reflect what is best for the entire community, based on member input. Dormer also pointed out that, from the journal perspective, involvement in GPP4 development has been a very valuable process, as it is important for publishers to be aware of guidelines so that their instructions for authors can be aligned.

DeTora and the panellists then discussed some of the key topics that will be addressed in GPP4:

  • alignment with the updated International Committee of Medical Journal Editors (ICMJE) recommendations on authorship
  • working with patients
  • preprints
  • enhanced content
  • social media.

The Publication Plan and ISMPP note that since the GPP4 publication has recently been submitted for journal review, it is not possible to include further details about the content of GPP4 covered during this session at the current time. Please check back after publication of GPP4, as a full summary of the discussions that took place at the meeting will be added.

Publication guidelines roundtable session

During this session, attendees were able to select from 5 roundtables, each of which was focused on a different set of publication guidelines:

  • CHEERS (Consolidated Health Economic Evaluation Reporting Standards 2022)
  • ICMJE (International Committee of Medical Journal Editors) Recommendations
  • GPP4 (Good Publication Practice 4)
  • ACCORD (ACcurate Consensus Reporting Document)
  • CONSORT (Consolidated Standards of Reporting Trials).

Attendees could then join the parallel session of their choice.

Non-clinical publications: Leveraging HEOR and RWE publications across the product lifecycle

In this parallel session, Charlotte Singh (Sanofi) and Rebecca Hahn (KJT Group) explained how product data are collected throughout the lifecycle of a drug or therapy and come from a range of sources beyond registrational studies. They highlighted strategic takeaways for non-clinical publications:

  • agreement on strategy for communication at each phase of development
  • publications build the overall value proposition for the product
  • cross-functional alignment is necessary to provide consistent, cohesive messaging.

It was pointed out that health economics and outcomes research (HEOR) and real-world evidence (RWE) studies are conducted, and publications developed, across the product lifecycle, from Discovery to Prelaunch and Launch (eg, quality of life study publications, comparative effectiveness and safety, disease and treatment burden) to Growth and Maturity (eg, budget impact models, registry studies, electronic health records and claims registries). Other non-clinical publications were also discussed, such as market-shaping and thought leadership publications (eg, review articles, patient journey mapping, patient preference, and disease and treatment burden).

Consequently, the scope and complexity of publication plans and strategies have evolved to account for these wider datasets. Several future considerations for more thoughtful strategic communication were raised: the increased need for cross-functional execution, for publication teams to serve as strategic partners, and for medical communications involvement early on and across the product lifecycle.

Guided Poster Tour: Patient Centric Issues

The guided poster tour focused on the following posters:

  • Do healthcare professionals really value plain language summaries? (ENCORE) – Dawn Lobban, Jacqui Oliver, Marissa Buttaro, David Falleni and Melissa McGrath (Presenter: Amanda Boughey)
  • Reviewing scientific publications to protect the privacy of clinical study participants (ENCORE) – Colin McKinnon, Friedrich Maritsch, Ingeborg Cil, Nicole Baumgartner, Jesse Potash, Borislava Pavlova, Valérie Philippon
  • Reporting of patient-reported outcomes in the peer-reviewed literature from the mid-1990s to the present day – Lisa Feder, Samantha Rivera and Larry Radican

Defining metrics that matter


KEY TAKEAWAYS
  • When choosing metrics, start with the objective of what you want to measure rather than what you can measure.
  • Stay aware of the expanding and changing metrics landscape to ensure the best selection of metrics aligned with your objectives.
  • Embrace new technological developments, such as automated dashboards, to achieve those objectives.

The need to quantify the impact and return on investment of publications is universal, yet there is no established, consistently used standard metric. In this parallel session, Neil Adams (Karger Publishers), Jennifer Ghith (Pfizer) and Todd Parker (MedThink SciCom) explored how we can evaluate our publication activities to improve the performance of data dissemination.

Defining meaningful metrics

First, we need to define the objective of the metric assessment. For example, this could be amplifying publication reach or identifying a publication gap. Once the objective is established, it can be used to identify the most appropriate metric.

Ask yourself:

  • what do you want to understand?
  • how can you measure it?
  • who is your audience?
  • how can metrics be optimised and disseminated?
  • what are you going to do with the information?

Assessing reach through the use of tactical metrics

Current metrics measure 3 factors: impact (eg, Journal Citation Reports), reach (eg, views and downloads), and speed (eg, time to decision, time to publication).

Many metrics are already available from journals in the form of metrics dashboards, such as those of NEJM and Elsevier. However, it is best to consider what you want to measure rather than what you can measure, and then see if this matches up with available metrics; it is always an option to reach out to publishers, as metrics such as time spent on page may not be routinely available but can usually be obtained upon request.

The current metrics landscape provides many ways to analyse how readers interact with publications. However, the truth behind the metrics can be complicated. Take, for example, the concept of ‘stickiness’ – if a person stays on the page for a long time, it could be that they were engaged, or it could be that the manuscript was confusing. Equally, an increase in social activity around a publication can be good or bad – not all news is ‘good news’.

Metrics that measure social media feeds must evolve and reflect new targets and platforms. Bibliographic platforms such as Mendeley, which allow users to quickly download publications to their library for later engagement, record only a short period of attention/engagement, while summary download metrics such as Altmetric don’t record whether an article was later read or not. Similarly, the livestream conversation feeds for the ISMPP Annual Meeting have moved from Twitter to Whova – illustrating how metrics capturing social media activity must be flexible to change with user behaviour.

Knowledge- or behaviour-based metrics

As well as measuring the impact and reach of publications, it is important to gauge how these affect the knowledge and behaviour of the medical community. Although a publication may have high impact in terms of views and citations, this does not reveal the sentiment of the ‘interaction’ with the publication. Tools such as Scite harness artificial intelligence to automatically assign the context of a reference citation as ‘supporting’, ‘contrasting’, or ‘neutral’. However, a pilot trial by Karger Publishers across 8-10 journals found that most results were classified as ‘neutral’, which wasn’t very informative.

The EMPIRE Index, published in PLOS ONE, addresses some perceived inadequacies of the current metrics landscape, such as the arbitrary weighting behind the Altmetric attention score, by including societal and social impacts. Using qualitative metrics such as these will be important when justifying future publication initiatives to colleagues.
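To make the role of weighting concrete, here is a minimal sketch of a composite attention score built as a weighted sum of activity counts; the sources and weights below are purely hypothetical and are not the actual Altmetric or EMPIRE Index weightings.

```python
# Hypothetical composite attention score: a weighted sum of per-source activity counts.
# The weights are illustrative only; changing them changes how articles rank.
WEIGHTS = {"news_stories": 8.0, "blog_posts": 5.0, "tweets": 0.25, "facebook_posts": 0.25}

def composite_score(counts):
    """Weighted sum of activity counts for one article."""
    return sum(WEIGHTS.get(source, 0.0) * n for source, n in counts.items())

article_a = {"news_stories": 1, "tweets": 40}   # one news story, modest Twitter activity
article_b = {"blog_posts": 2, "tweets": 120}    # no news coverage, heavy Twitter activity

print(composite_score(article_a))  # 18.0
print(composite_score(article_b))  # 40.0; a different weighting could reverse this ordering
```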

Forward-thinking metrics

Engaging with colleagues from more active analytical functions (such as commercial and data analytics teams) can yield further insight into the best way to measure publication success and appropriate metrics. However, it is unlikely that one solution will provide all the information required to assess whether all publication objectives have been achieved.

An ABC set of criteria for establishing a framework for forward-thinking metrics was presented:

  • Accuracy – does the chosen metric really measure what is to be assessed? One should be wary of social media activity counts, which may not indicate whether learning has taken place, and third-party sites that amalgamate multiple metrics, as these may not show the full picture.
  • Brevity – use small amounts of data that require medics to transfer from one site to another, telling a story and, in the process, measuring engagement with and retention of that story.
  • Consistency – be consistent with what you measure and how you measure over time to obtain the most meaningful metrics.

It was noted that the best metrics are:

  • responsive to change
  • long-lasting
  • tied to business value and the target audience.

As well as optimising the metrics we use to evaluate interaction with publications, we should also be working to measure the findability of our data. Globally, healthcare professionals (HCPs) spend ≥1.5 hours a day on online research, and if they can’t find the data, then the metrics won’t matter. Article volume has increased, which can lead to information overload, making it difficult to find content and gain share of voice.

At the end of the session, the panel answered questions from the audience. Regarding the need to benchmark the effect of a publication and the difficulty of interpreting metrics such as article download numbers, the panel suggested benchmarking against articles in the same issue of the journal, while cautioning that a lot of data need to be assimilated to achieve in-depth benchmarking. When asked if they use metrics to inform their publication plans, the panel responded that metrics were used to inform journal choice and to see if the development of article enhancements is worth continuing. Lack of consistency in metrics reporting between publishers was also raised, particularly with regard to page views; the panel confirmed that there are as yet no discussions among publishers about a consistent metrics platform, so authors and publication managers will still have to contact journals for this information on an individual basis. ISMPP provides several resources on best practice for metrics assessment, such as reports from the Social Media & Web-Based Metrics Working Group, which can inform journals as well as industry members and authors.
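As a minimal sketch of the benchmarking approach suggested above, the snippet below ranks one article’s downloads against the other articles in the same issue; the counts are invented for illustration, and real figures would have to come from the publisher.

```python
# Hypothetical benchmarking of one article's download count against the rest of its journal issue.
issue_downloads = {
    "article_of_interest": 820,
    "article_2": 450,
    "article_3": 1210,
    "article_4": 300,
    "article_5": 640,
}

target = issue_downloads["article_of_interest"]
others = [n for name, n in issue_downloads.items() if name != "article_of_interest"]

# Percentile rank: the share of other articles in the issue with fewer downloads
percentile = 100 * sum(n < target for n in others) / len(others)
print(f"Downloads: {target}; outperforms {percentile:.0f}% of the other articles in the issue")
```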

Behind the infographic magic: How strategic cross-functional collaboration drives better content and engagement


KEY TAKEAWAYS
  • Infographics are effective tools to deliver scientific messages. Cross-functional collaboration between stakeholders and enhanced visualisation of data are key to refining infographic content and boosting audience engagement.

In this parallel session, Gerry Johnson (ICON), Amy O’Connell (ICON), Tanya Brinsden (ICON) and Maya Shehayeb (Amgen) presented a strategic approach to developing infographics, combining design principles with data translation using enhanced visualisation.

Data show that people scan scientific posters for no more than 20 seconds before deciding whether to read on in more detail. Lots of dense text and figures can bury the important scientific message, meaning it gets missed.

Infographics are an effective tool to quickly capture an audience’s attention and allow data to come through clearly. Pairing design principles, ie clear communication and an organised story, with enhanced visuals helps to translate data, increase audience engagement, and drive impact. Cross-functional collaboration between scientific, creative, and other key stakeholders is a crucial component to the success of this process.

Understanding how an audience comprehends information is an important first step. Visual iconography can help draw an audience and aid their comprehension by acting as a mental shortcut. Of equal importance is an appreciation of what types of visual aids work best in different venues: what works well in a small room may not work well in a large auditorium.

The speakers outlined a phased process of developing infographics with an enhanced visual approach:

  • Evaluation and planning – assess the audience and venue to identify the major points to be presented (use links and/or supplemental materials for complete data). Consider the layout and organisation of data.
  • Translation – translate data using an enhanced visual approach to improve comprehension without oversimplifying (eg iconography, visual hierarchy, interactivity, and graphical visualisation).
  • Refinement – check that the content reads correctly and is an accurate representation of the data, then edit and re-evaluate.
  • Review and approval – seek approval from key stakeholders and ensure the layout is fact checked before submitting for medical and legal review. Incorporate any final edits.

Testimonials have shown that an enhanced visual approach is well received by audiences, and although budgetary and time constraints can be barriers to implementation, metrics demonstrate that the increased investment is well spent in terms of impact and readership.

ISMPP Authorship Algorithm Showcase


KEY TAKEAWAYS
  • The ISMPP Authorship Algorithm Tool will soon be available for use via the ISMPP website.
  • Based on a clear interpretation of ICMJE criterion #1, the tool can assist publication professionals with conversations and decisions surrounding authorship eligibility.

As previewed during this year’s ISMPP European Meeting, the ISMPP Authorship Algorithm Tool will soon be available for publication professionals to use, following testing and feedback on the pilot version. Karen Mittleman (member of the ISMPP Authorship Algorithm Task Force) opened the session with a historical perspective on how the ICMJE authorship criteria are related to Good Publication Practice, noting the age-old challenge of defining and quantifying authors’ contributions against ICMJE criterion #1. This issue led to the creation of the ISMPP Authorship Task Force. As described by Maggie Hodgson (Merck & Co.), and as later highlighted by Rob Matheis (ISMPP President and CEO), practical guidance on this matter has been long overdue.

“[So many of the questions I receive] are all about authorship – what are we going to do, how do we know if this person qualifies [for authorship] or not? So [the tool] was kind of an obvious thing to do that was long overdue.” – Rob Matheis, ISMPP President and CEO

Scott Thompson (Acceleration Point) outlined some of the technology under the hood of the online tool, which has been designed to easily integrate into users’ existing Microsoft ecosystems, along with forthcoming implementation aids that will be available to users on the website. Thompson then handed over to Jeff Clemens (Eli Lilly) who presented a video demonstration of the tool. Clemens walked through the steps required to set up a project, choose the appropriate weightings for the algorithm based on the type of project, and ultimately generate objective contribution scores for each potential author against the various facets of ICMJE criterion #1.
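As a purely illustrative sketch of how this kind of weighted contribution scoring can work, the snippet below combines per-facet ratings into a single score for each candidate author; the facets, rating scale, and weights are hypothetical and are not taken from the ISMPP tool itself.

```python
# Hypothetical weighted scoring of contributions against the facets of ICMJE criterion #1.
# Facet names, weights, and the rating scale are illustrative only.
WEIGHTS = {
    "conception_or_design": 0.4,
    "data_acquisition": 0.2,
    "data_analysis": 0.2,
    "data_interpretation": 0.2,
}

def contribution_score(ratings):
    """Combine per-facet ratings (0 = none, 1 = minor, 2 = substantial) into one weighted score."""
    return sum(WEIGHTS[facet] * rating for facet, rating in ratings.items())

candidates = {
    "Contributor A": {"conception_or_design": 2, "data_acquisition": 0,
                      "data_analysis": 1, "data_interpretation": 2},
    "Contributor B": {"conception_or_design": 0, "data_acquisition": 2,
                      "data_analysis": 0, "data_interpretation": 0},
}

for name, ratings in candidates.items():
    # Higher scores indicate a stronger documented case under criterion #1
    print(f"{name}: {contribution_score(ratings):.2f}")
```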

With the website soon to be up and running, Sonia Schweers (Bristol Myers Squibb) closed the session with a call to action, appealing for people to use the tool. Schweers noted that the more the tool is used in discussions surrounding authorship, the better it will become (through updates based on user feedback) and the easier it will be to implement as it gains credibility among the biomedical publishing community.

Why not also read the summaries of Day 1 and the morning and afternoon sessions of Day 2 of the meeting?

——————————————————–

Written as part of a Media Partnership between ISMPP and The Publication Plan, by Aspire Scientific, an independent medical writing agency led by experienced editorial team members, and supported by MSc and/or PhD-educated writers.

——————————————————–
