
EMP SIG Annual Report 2015

Electronic Media/Photo Special Interest Group

Spring 2015 Annual Report

Mission statement:

To continue year-round dialogue pertaining to electronic media and photography and how they can be better used to communicate the findings of our institutions to the public we serve, while expanding the professional development and communication opportunities of our members.

Officers (current year):

  • Chair: Arlene Robertson
    Ontario Ministry of Agriculture, Food & Rural Affairs
  • Vice Chair: Jeff Hino
    Oregon State University
  • Vice Chair-elect: Dave Deken
    Oklahoma State University 

Officers (next year):

  • Chair: Jeff Hino
    Oregon State University
  • Vice Chair: Dave Deken
    Oklahoma State University 
  • Vice Chair-elect: to be decided at the SIG meeting in Charleston

Did the SIG present an Award of Excellence this year?
No nominations were received.

 

What were your goals for the year?

  • To communicate regularly throughout the year with SIG members and include them in the decision-making process.
  • To provide responsive professional development opportunities that meet members' needs and keep discussions going throughout the year (e.g. the SIG Facebook page, newsletters).

What successes did you have in reaching those goals? 

  • A newsletter went out to members approximately every two months, highlighting upcoming professional development opportunities and ACE activities and asking for feedback on topics.
  • The number of Facebook views and postings by members increased.
  • We hosted three professional development webinars, including the April 23 Digital Asset Management Hangout:
      • 12 live participants
      • 73 views of the YouTube recording (April 23 – May 5, 2015)
  • Working with ACE, we piloted a peer judging format for the Photography Category of the C&A program based on member feedback (see Appendix 1).
  • We secured judges for the Audio and Video categories of the C&A program.
  • We made improvements to the Photo C&A category to better reflect emerging technology and techniques.
  • We reviewed and ranked conference session proposals for Charleston 2015.
  • We planned the SIG conference meeting.

What barriers kept you from achieving the goals?

  • With the help of the executive and member support, we achieved our goals for the year.

What would you have done differently?

  • Nothing

What professional development has been carried out?

We hosted three professional development webinars:

  • October 8, 2014 - Shameless Social Media
  • December 9, 2014 - Anatomy of an Award-Winning Photo
      • Number of people who watched the live-streamed Google Hangout On Air: 16
      • Number of views on archived YouTube (12/9/14 – 1/5/15): 109
      • Estimated minutes watched: 1,335
      • Number of photo awards won by one of our panelists while actually doing the Hangout: 1
  • April 23, 2015 - Digital Asset Management Hangout
      • Number of people who watched the live-streamed Google Hangout On Air: 12
      • Number of views on archived YouTube (04/23/15 – 05/05/15): 73

How did you communicate with members?

  • Through the bimonthly e-newsletter, the SIG's Facebook group page, and the professional development webinars described above.
How did officers divide responsibilities?

  • The Chair’s role is reporting and communication (currently through an e-newsletter and the SIG’s Facebook group page), as well as assisting with the coordination of professional development sessions and general oversight.
  • The Vice Chair coordinates the C&A awards, professional development, and EMP SIG website maintenance.
  • The Vice Chair-elect assists with C&A, professional development, and EMP Facebook page maintenance, contributing as they are able.
  • All officers vote on the SIG Award of Excellence winners, coordinate professional development, review and recommend conference sessions, update the ACE website, and work on items assigned by ACE.

What can the Board do to help SIG leaders in their responsibilities and to make SIGs more effective?

  • Notify the SIG when new members join
  • Include in the SIG Handbook practical, detailed instructions/guidelines for hosting/scheduling SIG webinars or Google Hangouts
  • Include in the SIG Handbook information on how to post to the SIG site on the ACE website
  • Post the SIG leader handbook on the ACE website.
  • Identify SIG leadership at the conference (e.g. a ribbon on the name tag to promote the SIG and start a discussion)
  • Add upcoming professional development events to the ACE homepage in a very obvious way. They are currently hidden deep inside the SIG pages (if the presenters even remember to put them there)!

What other concerns or thoughts would you like to share with the Board?

The ACE website is clunky to use, and adding new material is absolutely byzantine.

APPENDIX 1     Pilot of a Peer Review Process for Judging C&A Category 7 Photography

The EMP SIG piloted the use of peer-review for the ACE C&A awards in the photography category. This was an effort to respond to several issues historically associated with the C&A:

  • Lack of consistent, timely and useful feedback from judges on the C&A review
  • Difficulty in finding judges
  • Relatively small number of entries in some categories

In addition, a peer-review approach could bring added value:

  • Recognition by peers who best understand the nature of ACE audiences, challenges, and missions.
  • Faster turnaround of feedback (critiques) to applicants
  • If successful, this approach could be applied to other categories in the EMP C&A. Other SIGs might also be interested.

Based on feedback from members, the Photography classes were revised to reflect the interests of the SIG. Class 8b: Photo Series was removed, and Class 10 was renamed to reflect the use of various types of photo software for post-processing.

Class 7a: Feature photo – a single image that tells a story effectively in one shot.

Class 7b: Environmental portrait or personality photo – a photo of a person taken at work or in a setting related to their work.

Class 8a: Picture story – three or more photos used to tell a story.

Class 9: Service photo – for those pictures that are required as part of the mission of the institution. Examples would be shots of events such as fairs, field days, award presentations, etc.

Class 10: Photo Enhanced – for photos made entirely under the photographer's control, including setups, studio still life and/or pictures extensively manipulated through photo software such as Photoshop, mobile apps, etc.

Submission format: No change to the submission process. Entries in classes 7 to 10 were submitted online using the Fasttrack System. No physical entries were accepted.

Class      2012 entries   2013 entries   2014 entries   2015 entries
Class 7a        8              11             11             18
Class 7b        7               5             13             10
Class 8a        3               2              3              7
Class 8b        3               4              5             n/a
Class 9         0               2              2              4
Class 10        0               3              3              9

Survey Process:

  1. Each entry requires an entry form and a supplemental information document (250-word maximum). For details, see the general ACE C&A rules and instructions at https://www.aceweb.org/index.php/en/rules-a-instructions. The entry should include:
      • Purpose (goals, objectives, need)
      • Audience (type and size of target audience)
      • Marketing/promotion (describe important elements)
      • How diversity was incorporated into your entry. Does this project's creative concept show consideration for inclusion of all members of a potentially diverse audience (people of various races, genders, socioeconomic classes and other points of human diversity)?
      • Other information (production costs, special circumstances faced during production and any other details the viewers should consider while evaluating your entry)
  2. Photos, entry form and supporting information were submitted online to ACE using the Fasttrack system.
  3. ACE provided the EMP SIG leadership with access to the Fasttrack system.
  4. EMP SIG used Survey Monkey to host the photos and supporting information.
  5. The survey opened on February 10 and closed on March 4, 2015.

Scoring:

  1. All members of the EMP SIG received an email inviting them to participate in the peer review, explaining how the process worked, and supplying the URL for the online survey.
  2. Best practices and considerations for evaluation were outlined in the email to SIG members (e.g. “Remember, it could be YOU who receives this feedback, so make it valuable and meaningful.”). The limited amount of commentary from any one peer reviewer is offset by the volume of reviewers (the EMP SIG currently has 70+ members): even at only a 10% response rate (roughly seven reviewers), an entrant receives three to four times more feedback than a single judge supplied under the old system.
  3. The survey limited responses to one per unique IP address.
  4. Scoring was based on the following:
    Technical Quality
    Creativity/originality
    Audience Interest/Impact
    Overall Evaluation

(out of 25, where 1 = lowest, 25 = highest)

  5. The images with the highest scores won their respective category (a minimal sketch of this tallying logic appears after this list).
  6. In the case of a tie score in a category, the finalist was decided by the EMP SIG leadership team (Chair, Vice Chair, and Vice Chair-elect).
  7. The highest score for overall evaluation was used to determine the Outstanding Professional Skill Award.
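For illustration only, the following minimal Python sketch shows the tallying logic described above: total each entry's peer scores within its class, take the highest total as the winner, and flag ties for the leadership team to decide. The entry IDs, scores, and data layout are hypothetical; the actual scores were collected through Survey Monkey and this is not how the results were computed in practice.

    # Hypothetical sketch only: tally peer-review scores per entry within each
    # class, pick the highest total as the winner, and flag ties for the
    # EMP SIG leadership team. Data and entry IDs below are made up.
    from collections import defaultdict

    # (class, entry ID, one reviewer's overall-evaluation score out of 25)
    scores = [
        ("7a", "entry-01", 22), ("7a", "entry-01", 19),
        ("7a", "entry-02", 21), ("7a", "entry-02", 20),
        ("8a", "entry-07", 18), ("8a", "entry-08", 18),
    ]

    # Sum each entry's scores within its class.
    totals = defaultdict(lambda: defaultdict(int))
    for photo_class, entry, score in scores:
        totals[photo_class][entry] += score

    for photo_class, entries in sorted(totals.items()):
        best = max(entries.values())
        winners = [e for e, total in entries.items() if total == best]
        if len(winners) == 1:
            print(f"Class {photo_class}: winner {winners[0]} ({best} points)")
        else:
            # Tie: decided by the EMP SIG leadership team.
            print(f"Class {photo_class}: tie between {', '.join(winners)} ({best} points)")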

 

Post Judging Process:

A short survey was sent to members (March 16 – April 1, 2015) to gather feedback on the peer judging process.

 

 

Results of the Pilot

Judging – 30 members judged the photography entries.

The Photography classes saw an overall increase in entries for the 2015 program.

Post Survey Comments

 

Nine people answered the 5 questions in the post-judging survey (see attached).

In general:

45% felt the process met the goals of the pilot

55% felt the process was easy to use

55% felt enough information was provided to make an informed decision

55% liked judging photos using this process

Survey Monkey may not be the ideal platform to host this process, as members would have liked to see all photos side by side rather than in a scrolling vertical format. The images were not displayed on the same page, so they could not be compared when selecting prospective gold, silver and bronze winners.

Members would have preferred larger images than the software allowed. High-resolution images were not available, and members felt it was difficult to judge image quality from a small, low-resolution 300 ppi image.

The survey took approximately two hours to judge thoroughly.

Summary

The SIG will discuss the pilot and the results at the annual meeting in Charleston and decide how to proceed for next year’s C&A.
