Title: Issues of consistency in defining slices for slicing metrics: ensuring comparability in research findings

Abstract: Replicating previous studies is important because it allows an evidence base for a topic to be grown and more rigorous conclusions to be drawn. In this talk we report our replication of Meyers and Binkley's (2007) well-regarded study investigating the efficacy of program slicing metrics. Our results show that there are a variety of opportunities for inconsistency to creep into the collection and analysis of program slicing data during replication. Our findings suggest that it is currently difficult for subsequent researchers to accurately replicate previous program slicing studies such that consistent and reliable data can be added to a body of evidence. We conclude that a replication framework is needed to enable empirical work to be presented in sufficient detail for replication to be possible. Without this, it is difficult for a mature evidence base to develop for program slicing.