
ORIGINAL ARTICLES

A Randomized, Blinded Study to Validate the Merz Hand Grading Scale for Use in Live Assessments

Joel L. Cohen, MD,* Alastair Carruthers, MD, FRCPC,† Derek H. Jones, MD,‡ Vic A. Narurkar, MD,§ Martin Wong, BA,‖ Lisa N. Cheskin, MPH,‖ J. Richard Trout, PhD,¶ and David J. Howell, PhD#

BACKGROUND The Merz Hand Grading Scale (MHGS) is a 5-point scale used to grade the appearance of the dorsum of the hand. The MHGS has been previously validated for assessment of photographed hands but not for live assessment.

OBJECTIVE The purpose of this randomized, blinded study was to validate the MHGS for live assessment of
the hands in the clinical setting.

METHODS Three physician raters completed a scale qualification program that included MHGS training, ratings of standardized hand photographs, and statistical analysis for reliability. Eighty-four subjects (28 males, 30% Fitzpatrick Skin Types IV–VI, mean age of 42 years), randomized to 2 live assessment sessions for independent and blinded observation of the dorsa of their right hands, completed the study.

RESULTS The overall MHGS intrarater weighted Kappa value was 0.74 (95% CI, 0.68–0.79). First- and second-time hand-rating agreement scores ranged from 64% to 75%. Interrater weighted Kappa values ranged from 0.59 to 0.71, representing between-rater paired results for each combination of raters.

CONCLUSION High weighted Kappa values and agreement rates demonstrate that consistency at different time points can be achieved individually and by different raters for live assessments. The MHGS is a suitable instrument for live assessment in the clinical setting.

This study was sponsored by Merz North America, Inc. Dr. J. L. Cohen has served as a consultant for Merz.
Dr. A. Carruthers has also served as an investigator and consultant for Merz GmbH. Dr. D. H. Jones is an
investigator and consultant for Merz. Dr. V. A. Narurkar has served as a consultant and investigator for Merz.
Mr M. Wong is a clinical project director for Merz North America. The other authors have indicated no
significant interest with commercial supporters.

Scale validation demonstrates that an instrument has been subjected to rigorous scientific scrutiny and can be used with confidence. In 2012, the journal Dermatologic Surgery published studies exploring validated scales for specific areas of the face: global face,1 upper face,2 mid face,3 lower face,4 and neck volume.5 Collectively, these scales have promulgated a systematic approach to aesthetic medicine. Carruthers et al6 previously reported the results of their efforts to develop a validated hand grading scale, which is now the Merz Hand Grading Scale (MHGS). The MHGS (Figure 1) has been previously validated for use in photographic assessments of the hands and was the scale of choice for use in the Food and Drug Administration registration trial for correction of volume loss in the dorsum of the hands with calcium hydroxylapatite

*AboutSkin Dermatology and DermSurgery, Englewood, Colorado, and Department of Dermatology, University of Colorado, Aurora, Colorado; †Department of Dermatology and Skin Science, University of British Columbia, Vancouver, Canada; ‡Skin Care and Laser Physicians of Beverly Hills, Los Angeles, California; §Bay Area Laser Institute, San Francisco, California; ‖Merz North America, Raleigh, North Carolina; ¶Yardley, Pennsylvania; #San Francisco, California

© 2015 by the American Society for Dermatologic Surgery, Inc. Published by Wolters Kluwer Health, Inc. All rights reserved.
· ·
ISSN: 1076-0512 Dermatol Surg 2015;41:S384–S388 DOI: 10.1097/DSS.0000000000000553

S384


Figure 1. The Merz Hand Grading Scale (MHGS).

(Radiesse; Merz North America, Inc., Raleigh, NC).6 The MHGS underwent a robust validation process before use in the US pivotal trial, and this validation for live assessments was a critical component to demonstrate utility within the clinical setting.

Objective of the Study

The purpose of this study was to validate the MHGS for live assessments of the dorsal side of hands with all skin types, before and after treatment with soft-tissue fillers.

Methods and Materials

Study Design

Subject Demographics
A total of 86 subjects were enrolled and randomized to 2 live assessment sessions, with each session representing an independent and blinded observation of a participant's right hand. Eighty-four subjects completed both live assessment sessions and were included in the analysis (2 enrolled subjects did not present for the second session). The mean age was 42.1 years. Thirty-three percent of the subjects were male; 30% of the subjects were of darker skin types (Fitzpatrick Skin Types IV–VI7). Subject demographics are provided in Table 1.

TABLE 1. Live Subject Demographics (N = 84)

Age, years
  Mean (SD, range)               42.1 (19.4, 18–89)
Gender, N (%)
  Female                         56 (67%)
  Male                           28 (33%)
Fitzpatrick Skin Type, N (%)
  Type I                         7 (8%)
  Type II                        36 (43%)
  Type III                       16 (19%)
  Type IV                        16 (19%)
  Type V                         3 (4%)
  Type VI                        6 (7%)

Inclusion Criteria
Subjects had to have an evaluable right hand, without any uniquely identifiable features such as scars, tattoos, or an excess of hair that could potentially identify their hand at either rating session. In addition, subjects had to be 18 years of age or older, be representative of a wide range of ages, exhibit the full spectrum of the Fitzpatrick Skin Types, and be competent to provide written consent.

Evaluator Training
Three board-certified dermatologist physician raters (AC, JC, and DJ) were selected from the team of physician experts who photographically validated the

41:12S:DECEMBER SUPPLEMENT 2015 S385


MHGS. Before using the MHGS in this live validation study, the raters completed the scale qualification program that included:

• Completion of the MHGS training webinar conducted by the sponsor, using photos and consensus ratings from the 2007 photographic validation.
• Rating of 25 hand photos in Photo Booklet #1 (PB1). All raters returned PB1 to the sponsor before receiving the second of the 2 booklets.
• Rating of 25 hand photos in Photo Booklet #2 (PB2). PB1 and PB2 contained the same photos, but in a different randomized sequence.
• Statistical analysis of the evaluator qualification data by the biostatistician to determine intrarater and interrater agreement, assessed using weighted Kappa values, percent of exact agreement between PBs, and percent agreement with the 2007 consensus ratings.

All hand photos used in the training webinar and photo booklets were sourced from the Merz Frankfurt (Germany) photographic library created for the photo validation. The photo validation consensus ratings corresponding with each photo were used to create a guidance tool for use during qualification.

Live Training and Validation
A board-certified dermatologist (VN), who was not among the physicians who performed the live assessments, performed the live hand screening of potential subjects, including grading of the Fitzpatrick Skin Types.

Before participation, subjects provided written authorization to have their hands evaluated by the physician raters and to have nonidentifying photographs (Canon EOS Rebel XTi DSLR; Canon, Melville, NY) taken of their right hands (Figure 2). Unique finger characteristics were concealed by a sponsor proctor, before visualization and grading of a hand by a physician rater, by placing a piece of black material over the fingers while leaving the dorsum of the hand visible. Subjects were assigned a unique subject ID number and corresponding randomization assignments for each of the 2 rating sessions. The randomization assignments were created before the study by the biostatistician (J. Richard Trout, PhD; Yardley, Pennsylvania).

Each rater was assigned to a separate, nonadjacent evaluation room and an evaluation room proctor. The proctor ensured that the subjects were evaluated in the correct sequence for both rating sessions. The proctors and raters complied with the protocol requirements to optimize blinding by:

• Not allowing raters to observe subjects entering and exiting the room
• Not allowing conversation between subjects and raters
• Properly positioning the hand to be evaluated under the custom curtained frame for live assessment (Figure 3). The curtained frame prevented the rater from seeing the subject's face, upper torso, and arms and helped to prevent association of recognizable features, such as faces and clothing.

Figure 2. Sample hand photographs for use by physician evaluators.
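The agreement statistics named in the qualification and validation protocols, weighted Kappa and percent of exact agreement, can be sketched as follows. This is a generic, textbook linearly weighted Cohen's Kappa applied to a 5-point ordinal scale; the study's actual weighting scheme and confidence-interval method are not described here, so the function names, the grade labels 0-4, and the sample rating lists are illustrative assumptions, not study data or study code.

```python
from collections import Counter

def linear_weighted_kappa(rater_a, rater_b, categories=(0, 1, 2, 3, 4)):
    """Cohen's linearly weighted Kappa for two equal-length rating lists.

    `categories` is assumed to be the ordered labels of the 5-point scale
    (0-4 here); pass the actual grade labels if they differ. Generic
    textbook formulation, not the study's own analysis code.
    """
    k = len(categories)
    n = len(rater_a)
    # Linear disagreement weights: 0 on the diagonal, 1 at maximum distance.
    w = {(a, b): abs(categories.index(a) - categories.index(b)) / (k - 1)
         for a in categories for b in categories}
    # Observed weighted disagreement across the paired ratings.
    observed = sum(w[a, b] for a, b in zip(rater_a, rater_b)) / n
    # Expected weighted disagreement under independent marginals.
    ma, mb = Counter(rater_a), Counter(rater_b)
    expected = sum(w[a, b] * ma[a] * mb[b]
                   for a in categories for b in categories) / n ** 2
    return 1 - observed / expected

def percent_exact_agreement(rater_a, rater_b):
    """Fraction of hands given the identical grade in both ratings."""
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Identical rating lists yield Kappa = 1.0.
assert linear_weighted_kappa([0, 1, 2, 3, 4], [0, 1, 2, 3, 4]) == 1.0

# Hypothetical first- and second-session grades for 8 hands (not study data).
session_1 = [0, 1, 1, 2, 3, 3, 4, 2]
session_2 = [0, 1, 2, 2, 3, 4, 4, 2]
kappa = linear_weighted_kappa(session_1, session_2)
agreement = percent_exact_agreement(session_1, session_2)
```

Intrarater reliability pairs one rater's two sessions, as above; interrater reliability pairs two raters within one session with the same functions.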


Figure 3. Proper positioning of the hand under the custom-made curtained frame for live assessment.

There was a >60-minute pause between each rater's first and second ratings of the 84 hands during the live assessment validation day. The study was designed with a large number of subjects to minimize the likelihood of rater recall of MHGS ratings between sessions. Raters were not allowed to interact with each other during the validation study.

Each rater had a total of 168 case report form (CRF) ratings for subjects completing both rating sessions, each representing an independent observation of a right hand.

Study Results

Subject Compliance
Of the 86 subjects providing written authorization to participate and subsequently being screened, 84 completed both rating sessions. During the second rating session, the randomization sequence was adjusted to account for 2 subjects who did not present and for <5 subjects who were inadvertently delaying their scheduled hand rating appointments. During the validation study, proctors and the sponsor reconciled CRFs to ensure data accuracy and integrity.

Evaluator Training
The intra- and interrater agreement analysis for scale qualification was very consistent within raters (intrarater reliability), as demonstrated by high weighted Kappa values with scores ranging from 0.81 to 0.90 (protocol requirement: ≥0.60). The rate of agreement with the qualification guidance tool ranged from 52% to 60%. Percent of exact agreement between the first and second photo ratings was also good, ranging from 80% to 88%. Regarding interrater agreement, Raters 1 and 2 were in high agreement with each other (Kappa score = 0.84). Rater 3 showed only moderate agreement with Raters 1 and 2 (Kappa scores 0.52 and 0.50, respectively). The interrater results of Rater 3 required retraining by the sponsor before the live validation. During the retraining, the 3 physician raters met with the sponsor trainer and discussed discrepant scores to achieve rater alignment before the live validation.

Live Validation
The intra- and interrater agreement analysis for live validation was also very consistent within raters (intrarater reliability), as demonstrated by weighted Kappa values with scores ranging from 0.65 to 0.81 (protocol requirement: ≥0.60). The overall intrarater reliability weighted Kappa was 0.74 (95% CI, 0.68–0.79). Regarding interrater agreement, Rater 3 was in good agreement with Raters 1 and 2, with Kappa scores of 0.68 and 0.71, respectively. Moderate agreement of Rater 1 with Rater 2 (Kappa score 0.59) approached the protocol requirement (≥0.60). Exact agreement between the first and second live assessment hand ratings was also good, ranging from 0.64 to 0.75 (protocol requirement: ≥0.50).

Discussion

This was a large-scale study of a diverse patient population, both in terms of skin types and of gender distribution. Both intra- and interrater weighted Kappa scores were consistently within a narrow and relatively high range. The overall intrarater weighted Kappa value stratified over the 3 raters was 0.74 for live assessments of the MHGS. These intrarater weighted Kappa values ranged from 0.65 to 0.81. Additionally, the percent of exact rater agreement between their first and second ratings of the same hand ranged from 64% to 75%. These weighted Kappa values and high percentages of exact agreement


demonstrate that the MHGS can be used consistently by the same rater at different time points for live assessment of hands.

The interrater weighted Kappa values ranged from 0.59 to 0.71, demonstrating that the MHGS can be used consistently by different raters at different time points for live assessment of hands. Although the Rater 1 and Rater 2 Kappa was 0.59 and did not meet the protocol requirement, the difference was determined not to significantly impact the overall suitability of the assessment tool. The results of this validation study indicate that the 5-point MHGS is suitable for live assessments in clinical studies to grade dorsal hand condition.

Although the hand scale and its photographs that we used have already been published (in a study that includes 3 of the authors of the present article), this live assessment project demonstrates how this scale can be used for live patient evaluations and shows that it was well designed to achieve both intraobserver and interobserver consistency. We believe that this study can help increase consistency across clinical settings for live patient evaluations and that the scale is an important consultation tool to be used with the patient at the time of live assessment to discuss treatment of volume loss in hands. In terms of direct applicability, the study supports the value of the MHGS in live assessments of patients who are considering treatment of the dorsum of the hands with soft-tissue fillers to address signs of aging.

Although this was a large-scale study with a diverse population, it was not without its limitations. Bias does not seem to be present in the representation of subjects' Fitzpatrick Skin Type scores, but subjectivity of recruitment cannot be ruled out. On the other hand, the weighted Kappa values suggest that the evaluators either quickly acquired or already possessed notions of MHGS ratings that were fairly congruent.

Conclusions

High weighted Kappa values and agreement rates for individual raters and across raters demonstrate that consistency at different time points can be achieved individually and by different raters for live assessments using the MHGS. The MHGS is a suitable instrument for live assessment of hands in the clinical setting.

Acknowledgments The authors express their sincere appreciation to Lisa N. Cheskin, MPH, for her direction as Vice-President of Clinical Affairs during this study; to J. Richard Trout, PhD, for biostatistics; and to David J. Howell, PhD, for his editorial efforts.

References

1. Rzany B, Carruthers A, Carruthers J, Flynn TC, et al. Validated composite assessment scales for the global face. Dermatol Surg 2012;38:294–308.
2. Flynn TC, Carruthers A, Carruthers J, Geister TL, et al. Validated assessment scales for the upper face. Dermatol Surg 2012;38:309–19.
3. Carruthers J, Flynn TC, Geister TL, Görtelmeyer R, et al. Validated assessment scales for the mid face. Dermatol Surg 2012;38:320–32.
4. Narins RS, Carruthers J, Flynn TC, Geister TL, et al. Validated assessment scales for the lower face. Dermatol Surg 2012;38:333–42.
5. Sattler GS, Carruthers A, Carruthers J, Flynn TC, et al. Validated assessment scale for neck volume. Dermatol Surg 2012;38:343–50.
6. Carruthers A, Carruthers J, Hardas B, Kaur M, et al. A validated hand grading scale. Dermatol Surg 2008;34:S179–183.
7. Fitzpatrick TB. The validity and practicality of sun-reactive skin types I through VI. Arch Dermatol 1988;124:869–71.

Address correspondence and reprint requests to: Joel L. Cohen, MD, 499 E. Hampden Avenue, Suite 450, Englewood, CO 80113, or e-mail: [email protected]
