
Defense Acquisition University Alumni Association Research Paper Competition, 2nd Place

Metrics-Based Risk Assessment and Management of Digital Forensics

Mehmet Sahinoglu, MSgt Stephen Stockton, USAF (Ret.), Capt Robert M. Barclay, USAF (Ret.), and Scott Morton

Driven by the ubiquity of computers in modern life and the subsequent rise of cybercriminality and cyberterrorism in the government and defense industry, digital forensics is an increasingly salient component of the defense acquisition process. Though practiced primarily in the law enforcement community, digital forensics is increasingly used within the corporate world to meet legal and regulatory requirements. Digital forensics risk involves the assessment, acquisition, and examination of digital evidence in a manner that meets legal standards of proof and admissibility. The authors adopt a model of digital forensics risk assessment that quantifies an investigator's experience with

Lead image by Diane Fleischer

eight crucial aspects of the digital forensics process. This research adds quantification through a purpose-designed risk meter algorithm that calculates digital forensics risk indices. Numerical and/or cognitive data were painstakingly collected to supply the input parameters for the quantitative risk index of the digital forensics process. Much-needed risk management procedures and metrics are also appended.

Keywords: Cyberterrorism, cybercriminality, risk meter


Digital forensics is a topic that has been popularized by television programs such as CSI. Crime-solving glamour and drama aside, the reality is that the digital forensics process is a highly technical field that depends on the proper implementation of specific, well-accepted protocols and procedures. Inadequate forensic tools and technical examination, as well as lack of adherence to appropriate protocols and procedures, can result in evidence that does not meet legal standards of proof and admissibility. Digital forensics risk arises, for example, when personnel lack the proper tools to conduct investigations, fail to process evidentiary data properly, or do not follow accepted protocols and procedures.

Assessing and quantifying digital forensics risk is the goal of this article. To do so, the authors utilize a digital forensics risk meter, based on a series of questions designed to assess respondents' perceptions of digital forensics risk. Based on the responses, a digital forensics risk index will be calculated. This approach differs from others, which typically provide general guidance in the form of best practices, classification schemes, or, at best, a checklist for digital forensics procedures, and do not provide quantitative, game-theory-based tools for risk management and mitigation. Examples of such other approaches follow:


• U.S. Department of Justice, Forensic Examination of Digital Evidence: A Guide for Law Enforcement (general guidelines and worksheets) (U.S. Department of Justice, 2004)

• Error, Uncertainty, and Loss in Digital Evidence (certainty levels) (Casey, 2002)

• Cyber Criminal Activity Analysis Models using Markov Chain for Digital Forensics (suspicion levels) (Kim & In, 2008)

• Two-Dimensional Evidence Reliability Amplification Process Model for Digital Forensics (evidence reliability) (Khatir, Hejazi, & Sneiders, 2008)

• Building a Digital Forensic Laboratory: Establishing and Managing a Successful Facility (checklist) (Jones & Valli, 2011)

One approach that does employ quantification, Metrics for Network Forensics Conviction Evidence, is confined to network forensics, mostly measuring severity impact, and does not provide mitigation advice (Amran, Phan, & Parish, 2009). In that research article, the authors show "how security metrics can be used to sustain a sense of credibility to network evidence gathered as an elaboration and extension to an embedded feature of Network Forensics Readiness (NFR)." They then propose "a procedure of evidence acquisition in network forensics … then analyze a sample of a packet data in order to extract useful information as evidence through a formalized intuitive model, based on capturing adversarial behavior and layer analysis, … apply the Common Vulnerability Scoring System—or CVSS metrics to show the severity of network attacks committed…" (p. 1).

The digital forensics risk meter presented in this article will provide objective, automated, dollar-based risk mitigation advice for interested parties such as investigators, administrators, and officers of the court to minimize digital forensics risk. Figure 1 represents a decision tree diagram to assess risk; Figure 2 (with the Advice column on the right extracted from Figure B-1, Appendix B) represents sample mitigation advice generated from the respondents' inputs. This article will not only present a quantitative model, but will generate a prototype numerical index that facilitates appropriate protocols and procedures to ensure that legal standards of proof and admissibility are met.


FIGURE 1. DIGITAL FORENSICS RISK DIAGRAM

[Tree diagram: the root node, Digital Forensics Risk, branches into the eight vulnerabilities, each with its threats: Protocols & Procedures (Mission Statement; Personnel; Administrative; Service Request/Intake; Case Management; Evidence Handling/Retention; Case Processing; Technical Procedures Development), Evidence Assessment (Case Assessment; Onsite; Location Assessment Processing; Search Authority; Evaluation), Evidence Acquisition (Precautions; Protection; Preservation), Evidence Examination (Preparation; Physical Extraction; Logical Extraction; Timeframe Analysis; Data Hiding Analysis; Application/File Analysis; Ownership/Possession), Documentation & Reporting (Examiner Notes; Examiner Report; Findings Details/Summation), Digital Forensics Tools (Hardware; Software; Training; Funding), Legal Aspects (Jurisdiction; Search & Seizure; Admissibility), and Victim Relations (Victim Rights & Support; Court Preparation; Media).]


FIGURE 2. MEDIAN DIGITAL FORENSICS RISK METER RESULTS MITIGATED TO 35.83%

[Screenshot of the risk meter results table, spanning three pages in the original. For each selected vulnerability/threat branch it lists the vulnerability and threat values, the CM and LCM levels and residual risk before and after optimization, the optimized change, unit cost, final cost, and the generated advice. The two advice entries read: Increase the CM capacity for threat "Examiner Notes" for the vulnerability of "Documentation & Reporting" from 45.00% to 72.17% for an improvement of 27.17% (final cost $49.77); and Increase the CM capacity for threat "Victim Rights & Support" for the vulnerability of "Victim Relations" from 72.50% to 99.92% for an improvement of 27.42% (final cost $50.23). Summary panel: Total Change 54.59%; Total Cost $100.00; Criticality 1.00; Capital $1,000.00; Total Risk reduced from 0.458337 (45.83%, ECL $458.34) to 0.358337 (35.83%, ECL $358.34); ECL Delta $100.00.]

Note. CM = Countermeasure; ECL = Expected Cost of Loss; LCM = Lack of Countermeasure; Opt = Optimize to; Res. Risk = Residual Risk; Vulnerab. = Vulnerability.


Vulnerabilities, Threats, and Countermeasures

Based on industry best practices guidelines, such as the U.S. Department of Justice (2004) Forensic Examination of Digital Evidence: A Guide for Law Enforcement, eight specific vulnerabilities are assessed:

1. Protocols and Procedures

2. Evidence Assessment

3. Evidence Acquisition

4. Evidence Examination

5. Documentation and Reporting

6. Digital Forensics Tools

7. Legal Aspects

8. Victim Relations

Within each vulnerability category, questions pertain to specific threats and countermeasures. For example, within the Evidence Acquisition vulnerability, respondents are asked questions regarding precautions, protection, and preservation threats and countermeasures. Within the Evidence Examination vulnerability, respondents are asked questions regarding preparation, physical extraction, logical extraction, timeframe analysis, data hiding analysis, application/file analysis, and ownership/possession threats and countermeasures. Within the Digital Forensics Tools vulnerability, respondents are asked questions regarding hardware, software, training, and funding threats and countermeasures. Figure 1 details these vulnerabilities and threats. The responses are then used to generate a quantitative digital forensics risk index.
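To make the aggregation concrete: the branch-level figures reported later in Figure B-1 (e.g., 0.220042 × 0.415771 × 0.675 ≈ 0.0618 for the Protocols & Procedures/Personnel branch) indicate that each branch contributes vulnerability × threat × lack of countermeasure (LCM) to the index, and the branch contributions sum to the total residual risk. The short Python sketch below illustrates that reading using the Figure B-1 (ECSO8) values; the variable names are ours and not part of the risk meter software.

# Sketch of the residual-risk aggregation implied by the Figure B-1 columns:
# each branch contributes vulnerability x threat x lack-of-countermeasure (LCM),
# and the branches sum to the total residual risk index.
branches = [
    # (branch, vulnerability, threat, LCM) -- values from Figure B-1 (ECSO8)
    ("Protocols & Procedures / Personnel",              0.220042, 0.415771, 0.675),
    ("Protocols & Procedures / Administrative",         0.220042, 0.237754, 0.625),
    ("Protocols & Procedures / Service Request/Intake", 0.220042, 0.346476, 0.450),
    ("Documentation & Reporting / Examiner Notes",      0.317111, 0.559259, 0.550),
    ("Documentation & Reporting / Examiner Report",     0.317111, 0.440741, 0.625),
    ("Victim Relations / Victim Rights and Support",    0.462847, 0.408269, 0.275),
    ("Victim Relations / Court Preparation",            0.462847, 0.250646, 0.425),
    ("Victim Relations / Media",                        0.462847, 0.341085, 0.275),
]
residual_risk = sum(v * t * lcm for _, v, t, lcm in branches)
print(f"Total residual risk: {residual_risk:.6f}")  # ~0.4583, i.e., the 45.8% median figure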

Assessment Questions

Questions are designed to elicit responses regarding the perceived risk to proper digital forensics procedures, evidence handling/examination, admissibility, and other associated issues from particular threats, as well as the countermeasures the respondents may employ to counteract those threats. For example, in the Evidence Examination vulnerability, questions regarding the data hiding analysis threat include both threat and countermeasure questions. Threat questions would include:

• Do file headers not correspond to file extensions?

• Did the suspect encrypt or password-protect data?

• Are hidden messages present?

• Are host-protected areas (HPA) present?

Countermeasure questions would include:

• Did the examiner correlate file headers to the corresponding file extensions to identify any mismatches that may indicate the user intentionally hid data?


• Did the examiner gain access to all password-protected, encrypted, and compressed files, which may indicate an attempt to conceal the data from unauthorized users?

• Did the examiner conduct a thorough steganographic analysis?

• Did the examiner gain access to HPAs that may indicate an attempt to conceal data?

Sample vulnerability (Evidence Acquisition) assessment questions employed in the digital forensics risk meter are found in Appendix A. Appendix A also clarifies and precludes confusion between Evidence Acquisition and materiel acquisition. The first proactive step in any digital forensic investigation is acquisition. The inherent problem with digital media is that it is readily modified just by accessing files. Working from a copy is one of the fundamental steps to making a forensic investigation auditable and acceptable to a court (Acquisition, n.d.).

Risk Calculation and Risk Management through Surveys

Based on their experience, the respondents answer yes or no to the survey questions. These responses are then used to calculate residual risk. Employing a game-theoretical mathematical approach, the calculated risk index is used to generate an optimization, or lowering of risk to desired levels (Sahinoglu, 2007, 2016). A more detailed set of mitigation advice will be generated to show interested parties (such as investigators, administrators, and officers of the court) where risk can be reduced to optimized or desired levels. An example of such risk reduction is shown in Figure 2, from 45.8 percent to 35.8 percent, which represents the median response from the study participants (Sahinoglu, Cueva-Parra, & Ang, 2012). Figure 2 is an actual screenshot of a results table, representing the median digital forensics risk meter results displaying threat, countermeasures, residual risk indices, optimization options, and risk mitigation advice. For this study, a random sample of responses from 27 survey participants was analyzed; their residual risk results are tabulated and presented in Appendix B. The survey portfolio used in this assessment, and upon which this research article is based, showed the complexity of the digital forensics field, encompassing tools, procedures, specific training, budget, and trial.

Digital forensics has two crucial phases (Appendix A). The first phase includes all the forensics involved with the collection of data, while the second concerns defending the data collected, the means by which the data were collected, and the chain of custody applied from the original collection until court (Sahinoglu, Stockton, Morton, Barclay, & Eryilmaz, 2014). The initial goal was to obtain survey input from local city leaders in Montgomery, Alabama. Although individuals from the Governor's Office, Montgomery Police Department, and District Attorney's Office were willing to assist, our short timeframe and their busy schedules prevented their offices from providing input to the digital forensics survey. Fortunately, the authors had contacts at other law enforcement offices, which agreed to make personnel available for the survey and eventual follow-up. Eventually, three law enforcement offices and one special investigation/training organization participated and provided valuable input.

Our first objective was to explain the purpose of the survey and the potential value the combined results could offer each of the offices. At each location, participants included investigators, initial responders, digital forensics specialists, and legal experts (i.e., District Attorney Office personnel). The range of expertise of the participants was invaluable, as each provided insight into an aspect of the survey that is often unique to a position within a department. Because of this range of expertise, the authors are confident they were able to capture the three main components of the survey portion of the Risk-o-Meter (RoM). Perspectives from collection of evidence, packaging of evidence for trial, and presentation of evidence at trial were all given. Although the special investigation/training organization had many fewer survey participants, they did offer a unique perspective, as they represented an organization that focuses on training digital forensics experts for the military.

The results were then run for each participant, determining the Initial Repair Cost to Mitigate. This was determined by using a Criticality of 1.0, Equipment Cost of $0.0, and a Production Cost of $1,000. The median of all results was determined and then optimized through the RoM to determine the best "bang for the buck" that would reduce the participant's Total Residual Risk by 10 percent. The initial Total Residual Risk for the median participant was 45.8 percent, with an Expected Cost of Loss (ECL) of $458.34. Once optimized, the Total Risk was reduced to 35.8 percent, and the ECL was reduced by $100 to a total ECL of $358.34 (Figure 2). The first optimized solution was to increase the countermeasure (CM) capacity for the "Examiner Notes" threat for the Documentation and Reporting vulnerability from 45.0 percent to 72.17 percent, for an improvement of 27.17 percent. The second optimized solution was to increase the CM capacity for the "Victim Rights and Support" threat for the Victim Relations vulnerability from 72.50 percent to 99.92 percent, for an improvement of 27.42 percent.
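As a quick arithmetic check of the figures just quoted (and consistent with the summary row of Figure B-1), the ECL follows from multiplying the total residual risk by the criticality and the capital (production) cost; a minimal sketch:

# ECL = residual risk x criticality x capital, using the inputs stated above
# (Criticality 1.0, Production Cost $1,000).
capital, criticality = 1000.00, 1.0
risk_before, risk_after = 0.458337, 0.358337
ecl_before = risk_before * criticality * capital   # -> $458.34
ecl_after = risk_after * criticality * capital     # -> $358.34
print(f"ECL: ${ecl_before:.2f} -> ${ecl_after:.2f} (delta ${ecl_before - ecl_after:.2f})")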

Table B-2 in Appendix B depicts a set of constrained linear equations used within the body of the risk meter's innovative second-stage software for the game-theoretic optimization necessary to create the Advice column (shown on the right in Figure 2). The Advice column's original survey calculations are depicted in Figure B-1, which displays company ECSO8: 14th Ranked Overall Median Survey. This is followed by Figure B-2, which displays company OPD1's Group Median Survey Taker's Original Survey Outcome, while Figure B-3 displays company AUPD5's Group Median Survey Taker's Original Survey Outcome. In each case, the company representative seemed impressed with the results and noted the results for possible future implementation. One organization actually commented that they had already begun looking into increases in at least one CM that was identified by the optimization. Clearly, this episode validated the tool and its usefulness in their eyes.


Discussion and Conclusions

The advantages of conducting business on the Internet have been well documented. Conducting business online is frequently faster and cheaper than utilizing traditional methods. However, this comes with the digital forensics-related vulnerabilities and pertinent threats that tend to convert the positive advantages to clear disadvantages as a result of fraud and wrongdoing. With the advent of the Internet and burgeoning information systems, digital forensics has gained worldwide momentum. In every environment, the content of digital information relative to criminal undertakings and investigations alike has vastly increased, growing disproportionately to the capacities of state and local governments, as well as federal agencies and military components. Risk assessment, risk mitigation, and general risk management that involve a planned investment policy, prioritized through a sound, auditable, cost-effective approach, are the missing links. The proposed digital forensics risk meter is an innovative initiative that provides a quantitative assessment of risk to the user as well as recommendations for mitigating that risk. This approach will be a highly useful tool to interested parties such as investigators, company or system administrators, and officers of the court seeking to minimize and thereby mitigate digital forensics risk by leveraging and introducing early, preventive CMs identified as an outcome of this dynamic closed-end survey.

Additional future research by the principal author will involve the addition of cloud computing concerns such as service provider cooperation and data accessibility, as well as the incorporation of new questions so as to better refine user responses and the subsequent calculation of risk and mitigation recommendations. Minimization or mitigation of digital forensics risk will greatly facilitate the success of digital forensics investigations, ensuring that legal standards of proof and admissibility are ultimately met. The digital forensics risk meter tool provides the means to identify areas where risk can be minimized, as well as the objective, dollar-based mitigation advice to do just that. This aspect of objective, quantifiable risk assessment and management will add to the trustworthiness of acquisition practices in terms of dependable Internet communications involving great quantities of materiel and their budgetary repercussions.

Limitations and Future Research

The limitations are obvious due to input data deficiency, but methods such as the one proposed in this article are a good way to start, given their objective, hands-off, automated, cost-effective treatment of the problem at hand. Sound assessment of digital forensics risk can result when the information entered by learned respondents is as close to the truth as feasibly possible. The discussion that follows clarifies how this proposed work is directly relevant to acquisition risk mitigation if applied appropriately within a system.

This research article is not focused on the usual law enforcement or digital-policing procedures, but is directed towards greater awareness for the in-house (e.g., acquisition community) workforce as they manage already existing risk assessment and risk management algorithms. By leveraging the countermeasures outlined in this article (in particular, the Advice column in Figure 2, which employs probability-estimation and game-theoretic risk computing), the authors anticipate that acquisition practitioners can better preclude future digital forensics breaches by taking timely CMs.

Law enforcement, in cooperation with the defense acquisition community, is increasingly becoming an important player in digital forensics, thereby lending increased scrutiny to this vital area. Law enforcement is more aware of evidence of activities such as drug cartel operations and money laundering conducted through export, import, and domestic acquisition channels. Even in homicide cases, much useful evidence can be deduced by using digital forensics information. In addition, the digital forensics sciences not only can break a difficult case, but can do so quickly and inexpensively compared to police detectives' usual time-tested but tedious practices. The proposed risk meter software and its algorithm can successfully lead the way toward navigating the stages of cost-effective risk assessment and management.

In conclusion, the best “bang for the buck” derives from simple usability and scientific objectivity.


References

Acquisition. (n.d.). In Wikibooks. Retrieved from https://en.wikibooks.org/wiki/Introduction_to_Digital_Forensics/Acquisition

Amran, A. R., Phan, R. C. W., & Parish, D. J. (2009). Metrics for network forensics conviction evidence. Proceedings of the International Conference for Internet Technology and Secured Transactions (ICITST), Institute of Electrical and Electronics Engineers (pp. 1–8), London, England. doi: 10.1109/ICITST.2009.5402640

Casey, E. (2002, Summer). Error, uncertainty, and loss in digital evidence. International Journal of Digital Evidence, 1(2). Retrieved from https://utica.edu/academic/institutes/ecii/publications/articles/A0472DF7-ADC9-7FDE-C80B5E5B306A85C4.pdf

Jones, A., & Valli, C. (2011). Building a digital forensic laboratory: Establishing and managing a successful facility. Burlington, MA: Butterworth Heinemann & Syngress.

Khatir, M., Hejazi, S. M., & Sneiders, E. (2008). Two-dimensional evidence reliability amplification process model for Digital Forensics. Proceedings of the IEEE Third International Annual Workshop on Digital Forensics and Incidents Analysis (WDFIA 2008) (pp. 21–29), Malaga, Spain. doi: 10.1109/WDFIA.2008.11

Kim, D. H., & In, H. P. (2008). Cyber criminal activity analysis models using Markov chain for Digital Forensics. Proceedings of the 2nd International Conference on Information Security and Assurance (pp. 193–198), Busan, Korea. doi: 10.1109/ISA.2008.90

Sahinoglu, M. (2007). Trustworthy computing: Analytical and quantitative engineering evaluation. Hoboken, NJ: John Wiley.

Sahinoglu, M. (2016). Cyber-risk informatics: Engineering evaluation with data science. Hoboken, NJ: John Wiley.

Sahinoglu, M., Cueva-Parra, L., & Ang, D. (2012, May-June). Game-theoretic computing in risk analysis. Wiley Interdisciplinary Reviews: Computational Statistics, 4(3), 227–248. doi: 10.1002/wics.1205. Retrieved from http://authorservices.wiley.com/bauthor/onlineLibraryTPS.asp?DOI=10.1002/wics.1205&ArticleID=961931

Sahinoglu, M., Stockton, S., Morton, S., Barclay, R., & Eryilmaz, M. (2014, November 20). Assessing digital forensics risk: A metric survey approach. Proceedings of the SDPS 2014 Malaysia, 19th International Conference on Transformative Science and Engineering, Business and Social Innovation, Sarawak, Malaysia. Retrieved from https://www.researchgate.net/publication/268507819_ASSESSING_DIGITAL_FORENSICS_RISK_A_METRIC_SURVEY_APPROACH

U.S. Department of Justice. (2004). Forensic examination of digital evidence: A guide for law enforcement. Retrieved from https://www.ncjrs.gov/pdffiles1/nij/199408.pdf


Appendix A

Sample Vulnerability (Evidence Acquisition, Documentation and Reporting, and Victim Relations) Assessment Questions (in XML Format) and Survey Template

<survey>
  <vulnerability title="Evidence Acquisition" level="0">
    <vQuestion>Are special precautions not taken to preserve digital evidence?</vQuestion>
    <vQuestion>Was write protection not utilized to preserve and protect original evidence?</vQuestion>
    <vQuestion>Was digital evidence not secured in accordance with departmental guidelines?</vQuestion>
    <vQuestion>Was speed the primary concern when it came to acquiring digital evidence?</vQuestion>
    <threat title="Precautions">
      <tQuestion>Was evidence on storage devices destroyed or altered?</tQuestion>
      <tQuestion>Was equipment damaged by static electricity and magnetic fields?</tQuestion>
      <tQuestion>Was the original internal configuration of storage devices and hardware unnoted?</tQuestion>
      <tQuestion>Were investigators unable to provide drive attributes?</tQuestion>
    </threat>
    <threat title="Protection">
      <tQuestion>Was CMOS/BIOS information not captured?</tQuestion>
      <tQuestion>Was the computer's functionality and the forensic boot disk not tested?</tQuestion>
      <tQuestion>Did the forensic boot disk not boot?</tQuestion>
      <tQuestion>Did the investigators not collect drive configuration information from the CMOS/BIOS?</tQuestion>
    </threat>
    <threat title="Preservation">
      <tQuestion>Did the investigators not perform the acquisition using the examiner's system?</tQuestion>
      <tQuestion>Was a RAID present in the subject system?</tQuestion>
      <tQuestion>Was host-specific data not captured?</tQuestion>
      <tQuestion>Was successful acquisition not verified?</tQuestion>
    </threat>
  </vulnerability>
</survey>
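As an illustration only (not part of the risk meter software), the survey XML above can be read with Python's standard library once the closing tags are completed as shown; the snippet below walks the vulnerability/threat/question hierarchy for a trimmed excerpt:

# Parse a trimmed excerpt of the survey XML and list threats and their questions.
import xml.etree.ElementTree as ET

survey_xml = """<survey>
  <vulnerability title="Evidence Acquisition" level="0">
    <vQuestion>Are special precautions not taken to preserve digital evidence?</vQuestion>
    <threat title="Precautions">
      <tQuestion>Was evidence on storage devices destroyed or altered?</tQuestion>
    </threat>
    <threat title="Protection">
      <tQuestion>Was CMOS/BIOS information not captured?</tQuestion>
    </threat>
  </vulnerability>
</survey>"""

root = ET.fromstring(survey_xml)
for vul in root.findall("vulnerability"):
    print("Vulnerability:", vul.get("title"))
    for threat in vul.findall("threat"):
        print("  Threat:", threat.get("title"))
        for q in threat.findall("tQuestion"):
            print("    -", q.text.strip())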


DIGITAL FORENSICS RISK SURVEY

This survey has 8 main categories of vulnerabilities. Please identify the areas below where you have observed vulnerabilities while involved with digital forensics activities within your organization. * A minimum of 2 categories must be chosen:

Vulnerability Area              Reference Page
[ ] Protocols & Procedures      Pages 1 & 2
[ ] Evidence Assessment         Pages 3 & 4
[ ] Evidence Acquisition        Page 5
[ ] Evidence Examination        Pages 6 & 7
[ ] Documentation & Reporting   Page 8
[ ] Digital Forensics Tools     Page 9
[ ] Legal Aspects               Page 10
[ ] Victim Relations            Page 11

DIRECTIONS:

This Page:
• Select all vulnerability areas that apply
• Proceed to appropriate pages to complete survey for each vulnerability area

Survey Page(s):

Vulnerability
• Rate Vulnerability (0.1–10) with 10 being most vulnerable and 0.1 being least vulnerable
• Select all vulnerability statements that apply (must choose at least one)

Threat
• Rate Threat (0.1–10) with 10 being greatest threat and 0.1 being the least threat
• Using square check box, select all threat statements that apply to each threat category chosen (must choose at least one)

Countermeasure
• Rate associated Countermeasure for each threat category chosen above (0.1–10) with 0.1 being least effective and 10 being the most effective countermeasure
• Using square check box, select all countermeasure statements that apply (must choose at least one)


Rate (0.1–10) if vulnerability applies

Vulnerability: Legal Aspects (must select one statement, minimum, for each Vulnerability selected)
[ ] Legal authority for forensic examinations is unclear
[ ] The extent of the authority to search is unstated
[ ] Courtroom admissibility is not a prime consideration

Rate (0.1–10) for all Threats that apply; must select one Threat (minimum) for each Vulnerability selected

Threat: Jurisdiction | Countermeasures
[ ] There is conflicting jurisdiction | [ ] Jurisdiction is established among agencies prior to investigations
[ ] Multiple jurisdictions are often involved | [ ] Investigators and other officials from different areas coordinate and cooperate on cases
[ ] Potential evidentiary data are stored on the cloud or some other distant network resource | [ ] Court orders are obtained when requiring distant service providers to provide potentially evidentiary data
[ ] Cases often cross international borders | [ ] There are bilateral or multilateral agreements that facilitate cooperation with foreign law enforcement agencies

Threat: Search & Seizure | Countermeasures
[ ] Cases are often challenged for lack of probable cause | [ ] Forensic investigators unequivocally identify and articulate a probable cause necessary to obtain search warrants
[ ] On-site investigators often proceed without knowledge of a warrant | [ ] Search warrants are obtained prior to investigation on site
[ ] Investigators go beyond warrants originally used to assert search authority | [ ] New search warrants are obtained as new evidence is uncovered to avoid charges of "stale" warrants
[ ] The evidentiary chain of custody is often challenged | [ ] Full documentation of the evidentiary chain of custody is maintained throughout the investigation

Threat: Admissibility | Countermeasures
[ ] Digital evidence is sometimes changed by seizure | [ ] Strict measures are taken to ensure that when seizing digital evidence, the action does not change that evidence
[ ] Individuals besides forensic investigators access original digital evidence | [ ] Only forensically competent persons are allowed access to original digital evidence
[ ] Activity related to cases does not come under legal/judicial review | [ ] All activities related to seizures, access, storage, or transfer of digital evidence are fully documented, preserved, and available for legal/judicial review
[ ] The state of evidence is often unknown prior to opening files | [ ] Evidence is "frozen" prior to opening the files


Appendix B

Respondent Results Tabulations

TABLE B-1. COMPANIES’/RESPONDENTS’ (AFIT, AUPD, ECSO, OPD) SURVEY RESULTS FOR DIGITAL FORENSICS RISK METER STUDY

Survey Taker    Residual Risk %    Ranked Overall (Out of 27)    Remarks

AFIT1 52.47 6th 2nd out of 4 within AFIT

AFIT2 49.90 9th 3rd out of 4 within AFIT

AFIT3 52.71 5th 1st out of 4 within AFIT

AFIT4 47.64 10th 4th out of 4 within AFIT

AUPD1 31.15 26th 7th out of 7 within AUPD

AUPD2 39.67 20th 5th out of 7 within AUPD

AUPD3 50.02 8th 1st out of 7 within AUPD

AUPD4 36.98 21st 6th out of 7 within AUPD

AUPD5 44.59 16th ~ Overall Average 4th out of 7 within AUPD

AUPD6 46.06 13th 3rd out of 7 within AUPD

AUPD7 47.06 11th 2nd out of 7 within AUPD

ECSO1 51.80 7th 5th out of 9 within ECSO

ECSO2 46.66 12th 6th out of 9 within ECSO

ECSO3 56.94 2nd 2nd out of 9 within ECSO

ECSO4 57.67 1st 1st out of 9 within ECSO

ECSO5 54.87 3rd 3rd out of 9 within ECSO

ECSO6 41.36 19th 9th out of 9 within ECSO

ECSO7 54.84 4th 4th out of 9 within ECSO

ECSO8 45.83 14th Overall Median 7th out of 9 within ECSO

ECSO9 45.01 15th 8th out of 9 within ECSO

OPD1 35.00 23rd 4th out of 7 within OPD

OPD2 42.56 18th 2nd out of 7 within OPD

OPD3 44.35 17th 1st out of 7 within OPD

OPD4 33.39 25th 6th out of 7 within OPD

OPD5 28.23 27th 7th out of 7 within OPD

OPD6 34.39 24th 5th out of 7 within OPD

OPD7 36.41 22nd 3rd out of 7 within OPD

Note. Respondents are ranked within and overall, where Median is 45.83% (ECSO8) and Average is 44.73% (AUPD5: 44.49% is the closest respondent to 44.7%).


TABLE B-2. SET OF CONSTRAINED LINEAR EQUATIONS FOR TABLE B-1'S MEDIAN

Min COLLOSS (column loss), s.t. (subject to):

CM11 ≤ 1 (1), CM12 ≤ 1 (2), CM13 ≤ 1 (3), CM21 ≤ 1 (4), CM22 ≤ 1 (5), CM31 ≤ 1 (6), CM32 ≤ 1 (7), CM33 ≤ 1 (8), COLLOSS ≤ 1 (9)

CM11 ≥ 0.675 (10), CM12 ≥ 0.475 (11), CM13 ≥ 0.725 (12), CM21 ≥ 0.725 (13), CM22 ≥ 0.725 (14), CM31 ≥ 0.675 (15), CM32 ≥ 0.675 (16), CM33 ≥ 0.675 (17)

0.09148 CM11 - COLLOSS ≤ 0 (18), 0.05231 CM12 - COLLOSS ≤ 0 (19), 0.07629 CM13 - COLLOSS ≤ 0 (20), 0.17734 CM21 - COLLOSS ≤ 0 (21), 0.13966 CM22 - COLLOSS ≤ 0 (22), 0.18896 CM31 - COLLOSS ≤ 0 (23), 0.11601 CM32 - COLLOSS ≤ 0 (24), 0.15787 CM33 - COLLOSS ≤ 0 (25)

0.09148 CM11 + 0.05231 CM12 + 0.07629 CM13 + 0.17734 CM21 + 0.13966 CM22 + 0.18896 CM31 + 0.11601 CM32 + 0.15787 CM33 ≥ 1 - 0.3583 = 0.6417 (26)

Note. Used to attain a risk mitigated to 35.83% from an undesirable 45.83%, as inspired by Figure 2; Total # Constraints = 3 × #Selected Threats + 2 = 3 × 8 + 2 = 24 + 2 = 26, along with the Objective (Min).
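For readers who want to reproduce the optimization step, the program in Table B-2 is an ordinary linear program and can be solved with off-the-shelf tools. The sketch below encodes it with SciPy's linprog; this encoding is ours for illustration (not the risk meter's second-stage software), with the weights and lower bounds copied from Table B-2.

# Encode Table B-2's constrained linear program with SciPy (illustrative).
# Variables: CM11, CM12, CM13, CM21, CM22, CM31, CM32, CM33, COLLOSS (last).
from scipy.optimize import linprog

w = [0.09148, 0.05231, 0.07629, 0.17734, 0.13966, 0.18896, 0.11601, 0.15787]
lb = [0.675, 0.475, 0.725, 0.725, 0.725, 0.675, 0.675, 0.675]
n = len(w)

c = [0.0] * n + [1.0]                     # objective: minimize COLLOSS

A_ub, b_ub = [], []
for j in range(n):                        # constraints (18)-(25): w_j*CM_j - COLLOSS <= 0
    row = [0.0] * (n + 1)
    row[j], row[n] = w[j], -1.0
    A_ub.append(row)
    b_ub.append(0.0)
A_ub.append([-wj for wj in w] + [0.0])    # constraint (26): sum_j w_j*CM_j >= 0.6417
b_ub.append(-0.6417)

bounds = [(lo, 1.0) for lo in lb] + [(0.0, 1.0)]   # constraints (1)-(17), COLLOSS <= 1

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("COLLOSS:", round(res.x[-1], 5))
print("CM levels:", [round(v, 4) for v in res.x[:-1]])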


FIGURE B-1. ECSO8: 14TH RANKED OVERALL MEDIAN SURVEY TAKER'S ORIGINAL SURVEY OUTCOME

Vulnerability (VB)            vb        Threat                      threat    LCM       Risk      Post %   Post vb
Protocols and Procedures      0.220042  Personnel                   0.415771  0.675000  0.061754  0.13
                                        Administrative              0.237754  0.625000  0.032697  0.07
                                        Service Request/Intake      0.346476  0.450000  0.034308  0.07     0.280926
Documentation and Reporting   0.317111  Examiner Notes              0.559259  0.550000  0.097541  0.21
                                        Examiner Report             0.440741  0.625000  0.087352  0.19     0.403401
Victim Relations              0.462847  Victim Rights and Support   0.408269  0.275000  0.051966  0.11
                                        Court Preparation           0.250646  0.425000  0.049305  0.11
                                        Media                       0.341085  0.275000  0.043414  0.09     0.315673

Criticality 1.00; Capital Cost $1,000.00; Total Threat Costs N/A; Res-Risk * Criticality 0.458337; Total Res-Risk 0.458337; Expected Cost of Loss $458.34; Cust. Guess Res-Risk 0.50


FIGURE B-2. OPD1: GROUP MEDIAN SURVEY TAKER'S ORIGINAL SURVEY OUTCOME

Vulnerability (VB)            vb        Threat                      threat    LCM       Risk      Post %   Post vb
Evidence Assessment           0.309524  Onsite                      0.585714  0.450000  0.081582  0.23
                                        Evaluation                  0.414286  0.450000  0.057704  0.16     0.397961
Digital Forensics Tools       0.247253  Software                    0.422222  0.550000  0.057418  0.16
                                        Training                    0.577778  0.325000  0.046429  0.13     0.296705
Victim Relations              0.443223  Victim Rights and Support   0.438889  0.250000  0.048631  0.14
                                        Court Preparation           0.258333  0.450000  0.051525  0.15
                                        Media                       0.302778  0.050000  0.006710  0.02     0.305333

Criticality 1.00; Capital Cost $1,000.00; Total Threat Costs N/A; Res-Risk * Criticality 0.349998; Total Res-Risk 0.349998; Expected Cost of Loss $350.00; Cust. Guess Res-Risk 0.50


FIGURE B-3. AUPD5: GROUP MEDIAN SURVEY TAKER'S ORIGINAL SURVEY OUTCOME

Vulnerability (VB)            vb        Threat                      threat    LCM       Risk      Post %   Post vb
Protocols and Procedures      0.162121  Administrative              0.225000  0.650000  0.023710  0.05
                                        Service Request/Intake      0.285417  0.500000  0.023136  0.05
                                        Case Management             0.214583  0.675000  0.023482  0.05
                                        Case Processing             0.275000  0.675000  0.030094  0.07     0.225211
Evidence Examination          0.203030  Physical Extraction         0.500000  0.650000  0.065985  0.15
                                        Data Hiding Analysis        0.500000  0.450000  0.045682  0.10     0.250428
Documentation and Reporting   0.219192  Examiner Notes              0.500000  0.450000  0.049318  0.11
                                        Examiner Report             0.500000  0.625000  0.068497  0.15     0.264218
Legal Aspects                 0.132323  Search and Seizure          1.000000  0.325000  0.043005  0.10     0.096445
Victim Relations              0.283333  Victim Rights and Support   0.310096  0.275000  0.024162  0.05
                                        Court Preparation           0.347556  0.225000  0.022157  0.05
                                        Media                       0.342348  0.275000  0.026675  0.06     0.163697

Criticality 1.00; Capital Cost $1,000.00; Total Threat Costs N/A; Res-Risk * Criticality 0.445903; Total Res-Risk 0.445903; Expected Cost of Loss $445.90; Cust. Guess Res-Risk 0.50


Biographies

Dr. Mehmet Sahinoglu is the founding director of the Informatics Institute and Cybersystems and Information Security Graduate Program at Auburn University at Montgomery. Formerly the Eminent Scholar and Chair-Professor at Troy University's Computer Science Department, he holds a BS and MS in Electrical and Computer Engineering from Middle East Technical University-Ankara and the University of Manchester Institute of Science and Technology, United Kingdom, respectively; and a PhD in Electrical and Computer Engineering and Statistics from Texas A&M, jointly. Dr. Sahinoglu conducts research in Cyber-Risk Informatics. He is the author of Trustworthy Computing (2007) and Cyber-Risk Informatics: Engineering Evaluation with Data Science (2016), both with Wiley Interscience.

(E-mail address: [email protected])

MSgt Stephen Stockton, USAF (Ret.), has over 25 years in the IT field. He has developed, sustained, and operated military information systems. His experience includes application design, development, software lifecycle management, and software security. He has BS and MS degrees in Computer Science from Saint Leo University and Troy State University. He is currently enrolled in the Cybersystems and Information Security Master's Program at Auburn University at Montgomery. MSgt Stockton worked as a senior software engineer for General Dynamics after retiring from the USAF and now serves as an acquisition program manager at Maxwell-Gunter AFB.

(E-mail address: [email protected])


Capt Robert M. Barclay, USAF (Ret.), is currently a part-time research and teaching associate at Auburn University at Montgomery, and he is the IT security manager for the State of Alabama's Unified Judicial System, responsible for network security for the State Courts since 2009. He was previously employed by General Dynamics, and he was also employed by Troy State University at Montgomery for IT security and distance learning. He has 33 years of combined military and civilian service in IT security and related forensics experience. He holds a BS in Information Systems Management and is currently pursuing an MS in Cybersecurity, both from the University of Maryland.

(E-mail address: [email protected])

Mr. Scott Morton is a part-time research associate at Auburn University at Montgomery and an adjunct professor of Cybersecurity and CS Programming at the Troy University Montgomery campus and South University in Montgomery. He holds an MS in Computer Science, summa cum laude, from Troy University Montgomery and a BA in International Relations from Johns Hopkins University. He currently researches Cybersystem Security Risk Assessment and Management.

(E-mail address: [email protected])

This content is in the Public Domain.