[ecoop-info] CFP: Special Issue on: "Next Generation of Empirical SE" in Automated Software Engineering Journal
Burak Turhan
Burak.Turhan at oulu.fi
Fri Jun 22 17:38:54 CEST 2012
================================================
Call for papers
Automated Software Engineering Journal
Special Issue: Next Generation of Empirical SE
GUEST EDITORS:
Stefan Wagner (University of Stuttgart)
stefan.wagner at informatik.uni-stuttgart.de
Tim Menzies (West Virginia University)
tim at menzies.us
================================================
It is well-established that, using data mining, we
can predict properties of a wide range of software
engineering products (e.g. [1,2]). Now, it is
time to ask “what’s next?”.
We ask for papers that explore “what’s next”.
Based on all our past experience on data mining
for SE, what can we now say about:
o New pre-processing: before running the miners,
what extra processing do we now know is required?
o New generalities: what new tools, methods or
generalities might be proposed?
o New usage issues: what new issues have emerged?
o New ways to use these tools: after the
prediction is made, what do we now know about how
different communities use these tools?
For this journal special issue, we seek archival
contributions (not speculative proposals). The
papers must describe mature results with strong
evaluations. Papers must discuss automated methods
for addressing issues relating to the next
generation of empirical SE. Those issues include,
but are not restricted to the following:
o Data quality issues (e.g. [3]);
o Ensemble learning methods (e.g. [4]);
o Tool (mis)usability issues (e.g. [5]);
o Support for managerial decision making issues
(e.g. [6]);
o Data privacy issues (e.g. [7]);
o Cross-company learning issues (e.g. [8]);
o Predicting the quality of a software system
(e.g. [9]).
This call for papers is open to all researchers.
IS IT A NEXT GEN PAPER?
All submissions to the special issue must include
a section called “Empirical SE, V2.0” that
discusses next gen issues; i.e. how the work fits
into the broader picture beyond just building a
predictor (see notes above).
PUBLIC DATA
Papers are required to offer verifiable results;
i.e. they must be based on public-domain data
sets or models. Submissions should come with an
attached note offering the URL of the data/model
used to make the paper's conclusions. A condition
of publication for accepted papers is that their
data/model must be transferred to the PROMISE
repository (http://promisedata.org/data) prior to
final acceptance. That data/model must be in a
freely accessible format (i.e. no proprietary
formats).
DATES
Jan 1, 2013: submission
April 1, 2013: reviews, round 1
June 1, 2013: resubmit revised papers
SUBMISSION
Submit to http://www.editorialmanager.com/ause/,
adhering to the instructions for authors at
http://www.springer.com/computer/ai/journal/10515.
On submission, please include a note saying "For
the special issue on Next Gen Empirical Methods".
REFERENCES
1. Hall, T.; Beecham, S.; Bowes, D.; Gray, D.;
Counsell, S.: "A Systematic Review of Fault
Prediction Performance in Software Engineering".
Pre-print, IEEE Transactions on Software
Engineering. http://goo.gl/FOiT9
2. Dejaeger, K.; Verbeke, W.; Martens, D.;
Baesens, B.: "Data Mining Techniques for Software
Effort Estimation: A Comparative Study". IEEE
Transactions on Software Engineering, 2012.
http://goo.gl/eZ8RS
3. Gray, D.; Bowes, D.; Davey, N.; Sun, Y.;
Christianson, B.: "The Misuse of the NASA Metrics
Data Program Data Sets for Automated Software
Defect Prediction". IET Seminar Digest, 2011.
http://goo.gl/QE5au
4. Kocaguneli, E.; Menzies, T.; Keung, J.: "On
the Value of Ensemble Effort Estimation". IEEE
Transactions on Software Engineering.
http://goo.gl/0LWKZ
5. Shepperd, M.; Hall, T.; Bowes, D.: "A
Meta-Analysis of Software Defect Prediction
Studies". http://goo.gl/qtc9o
6. Heaven, W.; Letier, E.: "Simulating and
Optimising Design Decisions in Quantitative Goal
Models". RE'11. http://goo.gl/7bGJc
7. Peters, F.; Menzies, T.: "Privacy and Utility
for Defect Prediction: Experiments with MORPH".
ICSE'12. http://goo.gl/hF4Un
8. Turhan, B.; Menzies, T.; Bener, A.; Distefano,
J.: "On the Relative Value of Cross-Company and
Within-Company Data for Defect Prediction".
Empirical Software Engineering, 2009.
9. Wagner, S.: "A Bayesian Network Approach to
Assess and Predict Software Quality Using
Activity-Based Quality Models". Information and
Software Technology, 2010.