10 QUESTIONS TO KNOW IF YOUR CHILD TODAY MIGHT BE THE MOST WANTED CRIMINAL TOMORROW

FADI ABU ZUHRI

INTRODUCTION

A multitude of studies have found that kids with “callous and unemotional” traits (CU traits) are more likely than other kids (three times more likely, in one study) to become criminals or to display psychopathic qualities in adult life (Hagerty, 2017). Such children are found to be uncaring, shallow in their emotions and insincere. This article provides some simple questions that could help you identify such traits. Any parent should be vigilant about their child’s behaviour and identify these risk factors as early as possible. Studies have been done on children from infancy through the teenage years. Here are the findings.

PSEUDO-SCIENTIFIC TECHNIQUES FOR BEHAVIOURAL ASSESSMENT

Several tests, known as “projective tests”, are designed to let a person react to a given stimulus so that their response reveals hidden emotions or internal conflicts. These tests assume that humans have both conscious and unconscious motivations. Such tests are indirect, reduce the chances of someone faking a response and largely do not depend on verbal abilities. Examples include the “Draw-A-Person Test”, a psychological projection test for children; “The Hand Test”, which uses cards; and the “Szondi test”, a nonverbal personality test that measures hysteria, paranoia and manic tendencies (Soley & Smith, 2008). In addition to psychological analysis, these tests find applications in recruitment, marketing and business.

Numerology and graphology are other popular pseudo-scientific techniques. Numerology is based on the idea that human life is guided by numbers. Its origins date back to civilisations such as Babylon, Egypt, China, India and Greece. A numerologist assesses a person’s personality using details such as their birth date and full name (Buchanan, 2015).

Graphology analyses handwriting patterns to identify the personality characteristics of the writer. It differs from graphoanalysis, which involves forensic document examination to identify authorship through comparison with a known standard. Graphology finds applications in recruitment, where it can complement, but not replace, regular hiring tools. It has also been used in psychological analysis, marital compatibility assessment and medical diagnosis (Friedman & Schustack, 1999).

Although these tests have seen some success, questions remain about their validity, since they must be administered by qualified psychologists and their interpretation is highly subjective (Husbands, 1993).

TESTS ON BABIES

The Babylab has run some interesting tests on babies aged three to five months. The tests were non-verbal and studied the babies’ ability to tell good from bad. Five-month-old babies seemed to have a stronger sense of morality than three-month-old babies. This suggests that (1) babies are born with a sense of morality and (2) this sense changes as they grow (Adkin, 2017; Wynn & Bloom, 2013).

Surprisingly, a simple “red ball test” can assess the emotions of babies (Buglar, 2015). Psychologists used a red ball to track the visual preferences of 213 five-week-old babies, to see if they preferred interacting with an object or a human face. Those who favoured the ball displayed more callous traits two and a half years later. It’s too early to tell whether the child’s visual preference for a red ball over a human face is linked to psychopathic traits (Bedford, Pickles, Sharp, Wright, & Hill, 2015).

QUESTIONS FOR CHILDREN 3-4 YEARS OLD

Empathy, unlike sympathy, is the ability to feel and understand someone else’s emotions. A lack of empathy, remorse or guilt, shallow emotions, aggression, acts of cruelty and indifference to punishment are seen as predictors of developing psychopathic traits later in life (Hagerty, 2017).

Ask yourself these questions:

  1. Does your child feel bad or guilty when he/she does something wrong?
  2. Does your child feel empathy towards those who are hurt? Is your child unconcerned about the feelings of others?
  3. Does your child try to help when someone is hurt, for example by offering a toy? Is your child selfish, or unwilling to share with others?
  4. Do you feel the child is unable to connect emotionally with his/her peers, family members, etc.? Does your child seem unresponsive to affection? Does your child show little affection toward people?

Examples of excessive behaviour include ripping the head off his/her favourite teddy bear, slashing the tires on the family car, starting fires, and killing a pet.

  1. Did you notice any unusual or excessive aggression and cruelty in any behaviour? Is your child cruel to animals?

Indifference suggests a lack of interest or emotion.

  1. Does your child seem indifferent to punishment? Does punishment fail to change his/her behaviour?
  2. Does your child show too little fear of getting hurt?

QUESTIONS FOR CHILDREN 8-10 YEARS

Committing a crime, even when alone, reflects an interior impulse toward harm.

  1. Does the child commit crimes while alone, without pressure from peers?

Different types of crime show criminal versatility and are linked to future psychopathy.

  1. Does the child commit different types of crime in different settings?

Kiehl (2015) scanned the brains of inmates at high-security prisons to understand the differences between regular convicts and psychopaths. The study identified two abnormalities, which could also occur in the brains of callous children. The first was in the limbic system, the part of the brain that processes emotions. A psychopath’s brain has less grey matter in this area, suggesting that the person does not feel fear or recognise it in other people’s faces. In other words, they are cold-hearted (Kiehl, 2015).

  1. Can your child understand people’s emotions by looking at their faces, especially faces that display fear or sadness?

The second characteristic of a psychopath is an excessively high craving for excitement or rewards. For example, a callous-unemotional kid will keep going in a game (video games included) until they lose everything. They are also less likely to learn from their mistakes.

  1. Does your child exhibit a tendency to lose everything he/she has in a game? Is your child unwilling to forgo short-term pleasure for long-term gratification?

CONCLUSION

Several tests have been developed to identify psychopathic traits in children and adults. They tend to show that a callous and unemotional person is more likely to become a psychopath. This does not, however, mean that every aggressive child is a future criminal, but statistics suggest the chances are high.

Even something as seemingly harmless as feeding your child dairy, gluten, soda and sugar could do long-term damage to their emotional development. Dr. Bob, “the drugless doctor”, states that children whose diets are high in trans fat (partially hydrogenated fat) are at a higher risk of developing depression and ADHD (Attention Deficit Hyperactivity Disorder) (Dr. Bob, 2017).

As a parent you need to be aware of your children’s behaviour and how well they handle emotions. In case of extreme behaviours it is wise to consult a child psychologist.

REFERENCES

1.Adkin, R. (2017, October 30). Which Puppet Do You Like More? Baby Morals. Retrieved 2017, from https://www.youtube.com/watch?v=-LjiNNA7xBE&feature=youtu.be

2.Bedford, R., Pickles, A., Sharp, H., Wright, N., & Hill, J. (2015). Reduced Face Preference in Infancy: A Developmental Precursor to Callous-Unemotional Traits? Biological Psychiatry , 78 (2), 144–150.

3.Buglar, T. (2015, September 8). ‘Red ball test’ tells if babies will be psychopaths. Retrieved 2017, from The Scotsman: http://www.scotsman.com/news/red-ball-test-tells-if-babies-will-be-psychopaths-1-3880356

4.Dr. Bob. (2017). The ADHD and Trans Fat Link. Retrieved 2017, from http://druglessdoctor.com/food/non-dr-bob-approved/trans-fat/the-adhd-and-trans-fat-link/

5.Friedman, H. S., & Schustack, M. W. (1999). Personality: Classic Theories and Modern Research (5th ed.). Allyn and Bacon.

6.Hagerty, B. B. (2017, June). When Your Child Is a Psychopath. Retrieved 2017, from The Atlantic: https://www.theatlantic.com/magazine/archive/2017/06/when-your-child-is-a-psychopath/524502/

7.Husbands, R. (1993). Workers’ Privacy Part III: Testing in the workplace. Conditions of Work Digest , 12 (2).

8.Kiehl, K. A. (2015). The Psychopath Whisperer. New York: Crown Publishing Group.

9.Merchant, G. (2012, October 17). Should We Screen Kids’ Brains and Genes To ID Future Criminals? Retrieved 2017, from http://www.slate.com/articles/technology/future_tense/2012/10/should_kids_brains_and_genes_be_screened_to_detect_future_criminals.html

10.Soley, L., & Smith, A. L. (2008). Projective Techniques for Social Science and Business Research. Milwaukee: The Southshore Press.

YOUR CHILD TODAY MIGHT BE THE MOST WANTED CRIMINAL TOMORROW

FADI ABU ZUHRI

INTRODUCTION

Research suggests that any child, including yours, might turn into the most wanted criminal tomorrow. It is more than likely that an adult serial killer, or any other adult criminal, had abnormal, dysfunctional and inappropriate mental processes during his or her childhood (Slot & Hoeve, 2016). You might be breeding future potential criminals if you fail to help your child form meaningful bonds, fail to address predictors of violence at an early age, fail to address externalizing behaviour, fail to address certain early-childhood personality traits linked to malevolent behaviour and criminality, fail to instil self-control during childhood, or fail to help the child learn to challenge cognitive distortions.

EXTERNALIZING BEHAVIOUR AND VIOLENCE

Aggression, crime and hyperactivity are collectively referred to as externalizing behaviour. Aggression is a conduct disorder consisting of verbal or physical behaviours that harm or threaten to harm others, including animals, adults and children. This externalizing behaviour may be self-protective and appropriate, or destructive to others and to the self (Foshee & Bauman, 1992). Studies have identified externalizing behaviour as a strong predictor of adult violence, crime and convictions (Moffitt, 1993). For example, aggressive behaviours at ages 6 to 13 are thought to predict later violence among boys. Studies have also revealed that continuity in a child’s antisocial behaviour predicts aggression, which in turn predicts later violent crime (Loeber & Hay, 1996). In a study involving African American boys, McCord and Ensminger (1995) found that nearly 50% of 6-year-olds identified with aggressive and hyperactive behaviours had been arrested for violent crimes by age 33. By contrast, only one-third of the non-aggressive boys were arrested for the same crimes (McCord & Ensminger, 1995).

Crime is a heterogeneous concept reflecting diverse antisocial actions, including robbery, drug use, violence, vandalism, burglary and theft (Achenbach, 1978). Like aggression, crime in early childhood is believed to predict adult violence, crime and convictions (Tolan & Thomas, 1995). In addition, early onset of crime and violence has been linked to more chronic and serious violence (Tolan & Thomas, 1995). In Farrington’s (1995) study, one-half of the boys aged 10 to 16 who were initiated early into violence and crime had been convicted of violent crimes by age 25. Conversely, only 8% of juveniles aged 10 to 16 who were not initiated early into crime had been convicted of crimes by age 25.

Hyperactivity is a conduct disorder and an externalizing behaviour. Children with this disorder are seriously impaired and have social adjustment problems in adulthood. They are more likely to grow into psychopaths, displaying antisocial behaviour characterized by blunted affect, lack of guilt and remorse, attention problems, irresponsible behaviour and impulsivity (Ou & Reynolds, 2010). This antisocial behaviour and related problems predispose these children to violence and crime later in life.

PERSONALITY AND CRIMINAL BEHAVIOUR

A child’s personality has been identified as a factor that predisposes him or her to violence and crime in adulthood. The three personality traits captured in Eysenck’s PEN model, namely Psychoticism, Extraversion and Neuroticism, have been linked to criminality (Romero, Luengo, & Sobral, 2001; Robinson, 2004). For example, Daderman (1999) revealed that delinquents scored higher on the PEN variables than a control group of non-delinquents.

Neuroticism scores are thought to reflect antisocial behaviour, impulsive behaviour and emotional instability (Blackburn, 1993). Individuals with psychoticism traits are characterized by hostility, cruelty, low empathy, impulsivity, socialization deficits, aggressiveness and psychopathy (Blackburn, 1993). These characteristics are identified with delinquents and criminals (Blackburn, 1993). High neuroticism and psychoticism scores have also been associated with juvenile crime, which is thought to predict criminal behaviour later in life. Other studies have positively related juvenile crime to extraversion and psychoticism (Heaven, 1996).

High Impulsive Sensation Seeking (ImpSS) scores have also been associated with criminal behaviour. It is argued that individuals with high ImpSS scores are drawn to socially unacceptable and risky activities; their involvement in criminal activity arises from sensation seeking and the search for high arousal (Buker, 2011). For example, studies (Cernovsky, O’Reilly, & Pennington, 1997; Zuckerman, Ball, & Black, 1990) have positively associated sensation seeking with criminal and imprudent behaviours, including risky sexual behaviour, illicit drug abuse, alcohol abuse and smoking.

SELF-CONTROL AND CRIME

Lack of self-control has been found to predict a child’s violent behaviour later in life (Buker, 2011). Evidence from the criminological, sociological and psychological literature suggests an association between low self-control and deviant or criminal behaviour (Gottfredson & Hirschi, 1990; Payne, Higgins, & Blackwell, 2010). In fact, poor self-control is believed to be the key cause of delinquent and criminal behaviour later in life (Gottfredson & Hirschi, 1990).

COGNITIVE DISTORTIONS AND CRIMINAL BEHAVIOUR

Cognitive Distortions (CDs) are biases or inaccurate ways of attending to or conferring meaning on experiences (Barriga, Landau, Stinson, Liau, & Gibbs, 2000). These distortions include “minimisations”, “antisocial attitudes”, “justifications”, “criminal thinking styles”, “rationalisations”, “Self-Serving Cognitive Distortions (SSCD)” and distortions of “social cognition” (Walters, 1995; Abel et al., 1989; Murphy, 1990). Social-cognitive theories hold that CDs lead individuals to block their moral judgement in order to justify avoiding responsibility for their own attitudinal and behavioural problems (Kamaluddin, Shariff, Nurfarliza, Othman, Ismail, & Mat Saat, 2014). The criminology literature has also suggested that CDs may contribute to problematic behavioural and emotional responses, which may ultimately lead to deviant and criminal behaviour (Gendreau, Little, & Goggin, 1996). Elsewhere, SSCD has been identified as a set of criminogenic, antisocial attitudes that insulate individuals from a negative self-concept or from blame (Barriga et al., 2000).

PREDICTORS OF VIOLENT BEHAVIOUR

Psychologists have identified predictors of violence at an early age and grouped them into five categories (Shader, 2004):

  1. Individual factors, including physical and medical factors; aggressiveness; internalizing disorders; attitudes and beliefs favourable to antisocial or deviant behaviour; involvement in antisocial behaviour; concentration problems, hyperactivity, risk taking and restlessness; early initiation of violent behaviour; and a low resting heart rate.
  2. Family factors, including child maltreatment, parental attitudes favouring violence or substance use, parent-child separation, family conflict and poor family bonding, low parental involvement in child development, parental criminality, and poor family management practices.
  3. School factors, including low bonding to school, academic failure, dropping out of school, truancy, and frequent school transitions.
  4. Peer-related factors, including gang membership, delinquent peers, and criminal siblings.
  5. Neighbourhood and community factors, including community disorganization, poverty, neighbourhood adults engaged in crime, the availability of firearms and drugs, and exposure to racial prejudice and violence.

Violence in your child’s life is also predicted by prenatal and delivery trauma. Kandel and Mednick (1991) found an association between pregnancy and delivery complications and violence.

A low resting heart rate indicates under-arousal or a fearless temperament, which is believed to predispose a person to violence and aggression (Farrington, 1998). Farrington (1998) found a low resting pulse rate to predict violent crime. This suggests that a child with a low resting heart rate, or one who experienced pregnancy and delivery complications, may be predisposed to violent behaviour and crime.

Evidence from meta-analyses confirms a correlation between risk taking, concentration problems, restlessness and hyperactivity, and later violent behaviour. In a longitudinal study, Klinteberg et al. (1993) found that boys with concentration difficulties and restlessness were more likely than those without these characteristics to be arrested for criminal activities: fifteen per cent (15%) of boys with concentration difficulties and restlessness at age 13 had been convicted of a crime by age 26. Similarly, Farrington (1989) found that male students with restlessness and concentration problems, including frequent talkativeness, a tendency to fidget and difficulty sitting still, were likely to engage in violence and crime later in life. Children with concentration problems also showed academic difficulties, which themselves predicted later violence.

Attitudes favourable to violence, antisocial attitudes and beliefs, hostility towards police, and dishonesty have been shown to predict later violence among young males (Williams, 1994). Williams (1994) suggested that intervention programs aimed at helping young males develop standards and positive beliefs could enable them to reject cheating and rule breaking and minimize the risk of violence.

Studies have associated the involvement of a child in antisocial behaviours, notably drug selling, property destruction, stealing, early sexual intercourse, smoking and self-reported crime, with increased risk of violence (Zingraff, Leiter, Myers, & Johnson, 1993).

Parental criminality has been found to predict child criminality at later stages of life. Farrington (1989) found that boys whose parents had been convicted of crime before the boys’ 10th birthday were more likely to engage in violent crimes than boys whose parents had no criminal record. Similarly, Baker and Mednick (1984) revealed that men aged 18 to 23 with criminal parents were 3-8 times more likely than those with non-criminal parents to have been convicted of criminal acts.

Neglect, sexual abuse and physical abuse are forms of child maltreatment that have been linked to violent crimes. Evidence shows that neglected or physically abused children are more likely than other children to commit violent acts in their later lives (Smith & Thornberry, 1995; Zingraff et al., 1993).

Studies have linked poor family management practices, namely parental failure to set behavioural expectations for children, poor supervision, inconsistent and severe discipline, and poor monitoring, to substance abuse and crime in later life. Supporting this assertion, Wells and Rankin (1988) noted that children of strict parents committed more violent crimes than children of permissive parents, while children whose parents were neither too lax nor too strict were the least violent. Similarly, children whose parents punished them inconsistently, sometimes punishing a behaviour and sometimes ignoring it, were more likely to commit violent offenses than children whose parents punished consistently. Parental harshness and punitiveness were also found to predict later violence. In another study, poor child-rearing, poor parental supervision, an authoritarian parenting style, parental disagreement about childrearing, a cruel, neglectful or passive parenting attitude, and harsh parental discipline all predicted children’s involvement in crime later in life (Farrington, 1995).

It has been suggested that strong parental involvement in child development is a protective factor against crime and violence; conversely, low parental involvement and interaction may predict future involvement in crime (Williams, 1994). As revealed by Williams (1994), parent-child involvement and communication at age 14 reduced self-reported criminal behaviour at age 16.

Research suggests that parent-child separation disrupts parent-child relationships and predicts violent behaviour later in life. Henry et al. (1996) indicated that living with a single parent at age 13 predicted involvement in crime by age 18.

School factors such as low interest in education, low educational achievement, poor-quality schools, truancy and dropping out of school contribute to later violent and criminal behaviour (Maguin & Loeber, 1996; Hawkins, Farrington & Catalano, 1998).

Denno (1990) revealed that poor academic achievement in school predicted later crime. It was also revealed that academic failure in school increased one’s risk for later crime and violent behaviour (Maguin, Hawkins, Catalano, Hill, Abbott, & Herrenkohl, 1995).

Farrington (1989) revealed that a child growing up with delinquent siblings at age 10 is likely to have later convictions for crime and violence. Similarly, Maguin et al. (1995) confirmed a strong association between having delinquent siblings and later conviction for crime and violence, and found that antisocial siblings strongly influence their adolescent siblings. Moffitt (1993) likewise confirmed that adolescents whose peers did not approve of delinquent behaviour had a low likelihood of committing criminal acts. Elsewhere, gang membership is believed to predict later crime (Battin, Hill, Abbott, Catalano, & Hawkins, 1998).

Lastly, neighbourhood and community factors, including community disorganization, low neighbourhood attachment, poverty, the availability of firearms and drugs, frequent media portrayal of violence, exposure to racial prejudice and violence, and norms and laws favourable to violence may predict later violence and crime (Brewer et al., 1995; Sampson & Lauritsen, 1994; Henry et al., 1996).

BIOSOCIAL INTERACTION MODEL

The factors that predispose your child to crime and violence later in life, and the causal factors that underlie the problem, can be conceptualized using a biosocial model. This model proposes a relationship between predictors of violence and outcomes: biological and psychological risk factors during a child’s prenatal period give rise to factors that predict violence and crime later in life. In other words, psychological and biological risk factors influence the tendency to commit crime and become involved in violence later in life (Stoff, Breiling, & Maser, 1997).

EARLY BIOLOGICAL RISK FACTORS

Biological risk factors are the first component of the Biosocial Interaction Model. During the prenatal and perinatal period, these include both maternal and genetic pathophysiological factors that affect the growth and development of the foetus: illness during pregnancy, maternal malnutrition, alcohol and drug use, smoking during pregnancy, birth complications, and a genetic predisposition to risk factors inherited from the father and mother. Of particular importance are Foetal Alcohol Syndrome and abnormalities such as maldevelopment of the corpus callosum, which lead to neural maldevelopment of the foetus (Stoff, Breiling, & Maser, 1997). Tobacco use during pregnancy directly affects the structures of the central nervous system, while complications during pregnancy may injure the new-born’s central nervous system, leading to increased maladaptive and other externalizing behaviours (Orlebeke, Knol, & Verhulst, 1997).

PSYCHOSOCIAL RISK FACTORS

Psychosocial risk factors constitute the second element of the Biosocial Interaction Model. These risk factors are social and psychological in nature and occur during early childhood. They can be conceptualized as non-biological and include high psychosocial stress, teenage pregnancy, poverty, a negative attitude during pregnancy, and psychiatric factors such as alcohol and drug abuse (Curran, White, & Hansell, 2000).

CONCLUSION

In conclusion, the dynamics of crime and violence are best captured by a developmental approach, which recognizes how behaviour changes over time. During adolescence in particular, your child may experience tumultuous change that makes them vulnerable, increasing both the means and the frequency of expression of risky behaviours, including violence.

Aggression, crime and hyperactivity predict adult violence and crime. Similarly, antisocial behaviour, emotional instability and a lack of self-control among adolescents should be treated as red alerts. Parents need to realize the importance of the influence of the family, the school and the community in their child’s life. In fact, the mother’s behaviour and substance addictions during pregnancy have also been identified as risk factors.

For this reason, if you want to avoid breeding future potential criminals, target every stage of development, from the prenatal period to adolescence. Knowing what can negatively affect your child, and taking meaningful action to prevent it from happening, is the best thing you can do for your child’s future.

REFERENCES

1.Abel, G. G., Gore, D. K., Holland, C. L., Camp, N., Becker, J. V., & Rathner, J. (1989). The measurement of the cognitive distortions of child molesters. Annals of Sex Research , 2, 135−153.

2.Achenbach, T. M. (1978). The child behavior profile: I. Boys aged 6–11. Journal of Consulting and Clinical Psychology , 46, 478–488.

3.Baker, R. L., & Mednick, B. R. (1984). Influences on Human Development: A Longitudinal Perspective. Boston, MA: Kluwer-Nijhoff.

4.Barriga, A. Q., Landau, J. R., Stinson, B. L., Liau, A. K., & Gibbs, J. C. (2000). Cognitive distortion and problem behaviors in adolescents. Criminal Justice and Behavior , 27, 333–343.

5.Battin, S. R., Hill, K. G., Abbott, R. D., Catalano, R. F., & Hawkins, J. D. (1998). The contribution of gang membership to crime beyond delinquent friends. Criminology , 36, 93–115.

6.Blackburn, R. (1993). The Psychology of Criminal Conduct: Theory, Research and Practice. Chichester: John Wiley.

7.Brewer, D. D., Hawkins, J. D., Catalano, R. F., & Neckerman, H. J. (1995). Preventing serious, violent, and chronic juvenile offending: A review of evaluations of selected strategies in childhood, adolescence, and the community. In J. C. Howell, B. Krisberg, J. D. Hawkins, & J. J. Wilson, Sourcebook on Serious, Violent, and Chronic Juvenile Offenders. Thousand Oaks.

8.Buker, H. (2011). Formation of self-control: Gottfredson and Hirschi’s general theory of crime and beyond. Aggression and Violent Behaviour , 16, 265–276.

9.Cernovsky, Z. Z., O’Reilly, R. L., & Pennington, M. (1997). Sensation Seeking Scales and consumer satisfaction with a substance abuse treatment program. Journal of Clinical Psychology , 53, 779-784.

10.Curran, G. M., White, H. R., & Hansell, S. (2000). Personality, environment, and problem drug use. Journal of Drug Issues , 30, 375–405.

11.Daderman, A. M. (1999). Differences between severely conduct-disordered juvenile males and normal juvenile males: the study of personality traits. Personality and Individual Differences , 26, 827–845.

12.Denno, D. W. (1990). Biology and Violence: From Birth to Adulthood. Cambridge, UK: Cambridge University Press.

13.Farrington, D. P. (1989). Early predictors of adolescent aggression and adult violence. Violence and Victims , 4, 79–100.

14.Farrington, D. P. (1995). Key issues in the integration of motivational and opportunity-reducing crime prevention strategies. In P. O. Wikström, R. V. Clarke, & J. McCord, Integrating Crime Prevention Strategies: Propensity and Opportunity (pp. 333–357). Stockholm, Sweden: National Council for Crime Prevention.

15.Farrington, D. P. (1998). Predictors, causes and correlates of male youth violence. Crime and Justice , 24.

16.Foshee, V., & Bauman, K. E. (1992). Parental and peer characteristics as modifiers of the bond-behaviour relationship: An elaboration of control theory. Journal of Health and Social Behaviour , 33 (1), 66–76.

17.Gendreau, P., Little, T., & Goggin, C. (1996). A meta-analysis of the predictors of adult offender recidivism: What works! Criminology , 34 (4), 575–608.

18.Gottfredson, M. R., & Hirschi, T. A. (1990). General Theory of Crime. Stanford, CA: Stanford University Press.

19.Hawkins, J. D., Arthur, M. W., & Catalano, R. F. (1995). Preventing substance abuse. In M. Tonry, & D. P. Farrington, Building a Safer Society: Strategic Approaches to Crime Prevention. Crime and Justice: A Review of Research (Vol. 19, pp. 343–427). Chicago, IL: University of Chicago Press.

20.Hawkins, J. D., Farrington, D. P., & Catalano, R. F. (1998). Reducing violence through the schools. In D. S. Elliott, B. A. Hamburg, & K. R. Williams, Violence in American Schools: A New Perspective (pp. 188–216). New York, NY: Cambridge University Press.

21.Heaven, P. (1996). Personality and self-reported crime: Analysis of the “Big Five” personality dimensions. Personality and Individual Differences , 20, 47–54.

22.Henry, B., Avshalom, C., Moffitt, T. E., & Silva, P. A. (1996). Temperamental and familial predictors of violent and non-violent criminal convictions: Age 3 to age 18. Developmental Psychology , 32, 614–623.

23.Kamaluddin, M., Shariff, N. S., Nurfarliza, S., Othman, A., Ismail, K. H., & Mat Saat, G. A. (2014). Psychological traits underlying different killing methods among Malaysian male murderers. Malaysian J Pathol , 36 (1), 41–50.

24.Kandel, E., & Mednick, S. A. (1991). Perinatal complications predict violent offending. Criminology , 29, 519–529.

25.Klinteberg, B. A., Andersson, T., Magnusson, D., & Stattin, H. (1993). Hyperactive behavior in childhood as related to subsequent alcohol problems and violent offending: A longitudinal study of male subjects. Personality and Individual Differences , 15, 381–388.

26.Loeber, R., & Hay, D. F. (1996). Key issues in the development of aggression and violence from childhood to early adulthood. Annual Review of Psychology , 48, 371–410.

27.Maguin, E., & Loeber, R. (1996). Academic performance and crime. In M. Tonry, Crime and Justice: A Review of Research (Vol. 20, pp. 145–264). Chicago, IL: University of Chicago Press.

28.Maguin, E., Hawkins, J. D., Catalano, R. F., Hill, K., Abbott, R., & Herrenkohl, T. (1995). Risk factors measured at three ages for violence at age 17–18. Paper presented at the American Society of Criminology. Boston, MA.

29.McCord, J., & Ensminger, M. (1995). Path-ways from aggressive childhood to criminality. Paper presented at the American Society of Criminology. Boston, MA.

30.Moffitt, T. E. (1993). Adolescent-limited and life-course-persistent antisocial behaviour: A developmental taxonomy. Psychological Review , 100, 674–701.

31.Murphy, W. D. (1990). Assessment and Modification of Cognitive Distortions in Sex Offenders. In W. L. Marshall, D. R. Laws, & H. E. Barbaree, Handbook of Sexual Assault. Applied Clinical Psychology. Boston, MA: Springer.

32.Orlebeke, J. F., Knol, D. L., & Verhulst, F. C. (1997). Increase in child behaviour problems resulting from maternal smoking during pregnancy. Archives of Environmental Health , 52, 317–321.

33.Ou, S., & Reynolds, A. (2010). Childhood Predictors of Young Adult Male Crime. Child Youth Serv , 32 (8), 1097–1107.

34.Payne, B. K., Higgins, G. E., & Blackwell, B. S. (2010). Exploring the link between self-control and partner violence: Bad parenting or general criminals. Journal of Criminal Justice.

35.Robinson, M. B. (2004). Why crime? An Integrated Systems Theory of Antisocial Behaviour. Upper Saddle River, New Jersey: Pearson Prentice Hall.

36.Romero, E., Luengo, M., & Sobral, J. (2001). Personality and antisocial behaviour: Study of temperamental dimensions. Personality and Individual Differences, 31, 329–348.

37.Sampson, R., & Lauritsen, J. (1994). Violent victimization and offending: Individual-, situational-, and community-level risk factors. In A. J. Reiss, & J. A. Roth, Understanding and Preventing Violence (Vol. 3, pp. 1-114). Washington, DC: National Academy Press.

38.Shader, M. (2004). Risk Factors for Delinquency: An Overview. Retrieved 2017 from US Department of Justice: https://www.ncjrs.gov/pdffiles1/ojjdp/frd030127.pdf

39.Slot, W. N., & Hoeve, M. (2016). Tomorrow’s Criminals: The Development of Child Delinquency and Effective Interventions. Routledge.

40.Smith, C., & Thornberry, T. P. (1995). The relationship between childhood maltreatment and adolescent involvement in crime. Criminology , 33, 451–481.

41.Stoff, D., Breiling, J., & Maser, J. (1997). Handbook of antisocial behaviour. New York: Wiley.

42.Tolan, P. H., & Thomas, P. (1995). The implications of age of onset for crime risk: II. Longitudinal data. Journal of Abnormal Child Psychology , 23, 157–181.

43.Walters, G. D. (1995). The psychological inventory of criminal thinking styles, Part I: Reliability and preliminary validity. Criminal Justice and Behaviour , 22, 307-325.

44.Wells, L. E., & Rankin, J. H. (1998). Direct parental controls and delinquency. Criminology , 26, 263–285.

45.Williams, J. H. (1994). Understanding substance use, crime involvement, and juvenile justice system involvement among African-American and European-American adolescents. Unpublished dissertation, University of Washington, Seattle, WA.

46.Zingraff, M. T., Leiter, J., Myers, K. A., & Johnson, M. (1993). Child maltreatment and youthful problem behaviour. Criminology , 31, 173–202.

FINANCIAL INTELLIGENCE WITHIN CYBERCRIME

FADI ABU ZUHRI

INTRODUCTION

Financial Fraud or Financial Crime covers a range of criminal acts or offences that extend beyond national borders. These offences are international in nature and executed in Cyberspace (Boorman & Ingves, 2001). They impact financial sectors and international banking. These forms of crime affect organizations, nations and individuals, negatively impact the social and economic system, and cause considerable loss of money. They involve embezzlement, theft, skimming, Money Laundering, Ponzi schemes and phishing, to name a few (Assocham, 2015).

These crimes are often committed by organized criminal networks motivated by the prospect of earning huge profits, and assets are obtained illegally through Financial Fraud. Differences between countries, including differences in national jurisdictions and in the expertise of each country's prosecutorial and investigative authorities, make it difficult for law enforcement officers to trace criminals engaging in Financial Fraud (Interpol, 2017).

Financial Intelligence can curb Cybercrime by understanding what motivates Cybercriminals. This paper seeks to understand Cybercrime in the context of three types of Cybercriminals: Petty Thieves (PT), Professional Criminals (PC) and Information Warriors (IW). Rogers (2011) originally proposed a nine-category classification of Cybercriminals based on what motivates them. Of these nine types, the motivations of PTs, PCs and IWs revolve around Money Laundering and Financial Fraud.

MOTIVATIONS FOR FINANCIAL FRAUD

Petty Thieves (PT) use Cybercrime to further their existing criminal activities (Rogers, 2005). They are less interested in notoriety. They are attracted to the Internet and technology because their traditional targets, which include banks and naïve people, are found there. PTs learn and acquire the prerequisite skills that enable them to perpetrate Cybercrime. This group often shows a maturation of skills and is largely motivated by greed, revenge and financial gain (Parker, 1998).

Unlike Petty Thieves, Professional Criminals (PC) have larger ambitions and a higher set of technical abilities and skills. Like professional criminals within the traditional criminal domain, PCs are motivated to engage in criminal activities by financial and monetary gain. They seek fame and bragging rights, and take pride in accomplishing their criminal tasks. However, due to their sophistication, they are rarely caught and seldom attract the authorities' attention. Individuals belonging to this group are developmentally, psychologically and chronologically mature, with a high level of technical acumen. They often work with organized Cybercriminal groups who are adept at using Internet technology to further their criminal goals (Rogers & Ogloff, 2004).

The Information Warriors (IW) are persons who defend against and conduct attacks aimed at destabilizing, affecting or disrupting the integrity of the information and data systems that support command and control decisions (Sewell, 2004; Rogers, 2005). This group comprises non-traditional as well as traditional state-sponsored, technology-based warfare outfits. Its members are highly skilled, highly trained and motivated by patriotism to engage in Cybercrime.

PCs and IWs are considered the most dangerous Cybercriminals. Many are ex-intelligence operatives and professional criminals available as guns for hire (Post, Shaw, & Ruby, 1998; Post, 1998). Being extremely well trained, these individuals specialize in corporate espionage and have access to the state-of-the-art equipment needed to execute their plans (Denning, 1998).

FINANCIAL INTELLIGENCE TO CURB FINANCIAL FRAUD

It is recognized that Cybercrime can be managed through Financial Crime Risk Assessment, which is part of Financial Intelligence. Three prevailing narratives support this argument. The first is that information technology is creating new services and products, driving disruptive innovation and disrupting established business models: mainframes gave rise to Internet-enabled online banking, and ATMs and the Web drove the creation of person-to-person payments and banking apps. The second is that organized Cybercrime moves in tandem with these innovations to identify and exploit vulnerabilities and weaknesses for fraudulent gain. According to Ganaspersad and Shirilele (2015), organized Cybercriminals use fraud typologies that mirror the development of products, including stolen cheques and cards, increasingly sophisticated Cyber-attacks, attacks on Internet protocol weaknesses and advanced persistent threats. The common thread linking these narratives is the increasing speed of change in retail payments and banking, and the ability of Cybercriminals to respond with equal speed (SIPA, 2011).

The third and swiftly emerging narrative emphasizes the convergence of IT security and fraud risk management to overcome the shortcomings of the traditional model, which is characterized by constrained communication, limited shared understanding and separated functions. This narrative stresses the need for change, indicating that the existing risk management framework does not effectively guard institutions from financial loss and attack, with resultant damage to regulatory relationships and reputation. It emphasizes the importance of designing agility into financial institutions' risk management processes to facilitate a proactive response to criminal and innovation threats (Daws, 2015).

Indeed, in line with these narratives, it is widely recognized that Cybercriminals require financing and sustainable cash flows to fund their operations. Financial Intelligence emerges from this context: it encompasses the methods and means used by actors within the financial industry to reveal, deter and disrupt the financing of Cybercriminals. Cybercriminals engage in a range of financial activities to ensure that security agents do not disrupt their cash flows, and their modes of operation are diverse. They receive donations from unwitting and complicit sources; common means of financing include fraud, counterfeiting, kidnapping and extortion, and some actors engage in securities schemes and market-based commodities. All these activities are underpinned by a practice referred to as Money Laundering (American Security Project, 2011).

Money Laundering is the process of concealing illicitly gained funds and making them appear legitimately sourced. These are the funds that international financial institutions and security experts seek to curtail through Anti-Money Laundering practices and policy; transactions of any nature or amount made via conventional channels are detectable and traceable (American Security Project, 2011). Money Laundering allows Cybercriminals to clean up criminal proceeds and disguise their unlawful and illicit origins (Crown Prosecution, 2002). This is often achieved when Cybercriminals hack a government's or organization's IT infrastructure by means of various malware, allowing them to track people's online activities and transactions and to obtain passwords and other personal information. In this way, they siphon billions of dollars' worth of intellectual property, technology and trade secrets from the computer systems of corporations, research institutions and government agencies (Nakashima, 2011).

The use of conventional means, combined with extensive cooperation between the financial industry, governments and international financial institutions, can yield considerable success in deterring Cybercrime by combating Money Laundering. However, it is challenging for financial institutions and governments to detect and disrupt Informal Value Transfer Systems like “hawala”. These systems may not comply with the requirements of formal financial systems, which oblige firms to track and report Money Laundering activities. Being by nature informal and unregulated, they facilitate secrecy and allow Cybercriminals and other illegitimate actors to exploit Cyberspace with increasing regularity and conduct their financial operations undetected (Passas, 2003). Restricting these organizations' ability to access resources is an important component of the broader security strategy.

Financial Intelligence is by nature adaptive and requires a broad range of forensics, network analysis and technology, complemented by effective and smart policies (Bank of England, 2016). By understanding the factors influencing current practices and trends in Financial Intelligence, industry leaders and policymakers will be well informed and empowered to devise the judicious policies and private-public partnerships needed to secure the global financial system, effectively combat threat finance, and facilitate information-sharing.

SIPA (2011) proposes two trends that can enable firms to overcome Cyber threats: collective intelligence, and technical and professional services. With regard to collective Financial Intelligence, SIPA (2011) suggests that the evolving and distributed nature of Cyber threats requires financial institutions to create a networked, collaborative defence. Within the Cyber security context, collective intelligence involves sharing information about remedies, vulnerabilities and threats between security vendors, the government and enterprises. It can inform Cyber forensics to audit areas of suspected and known weakness, and can reveal areas and trends that warrant additional security investment. Vendors are developing shared Financial Intelligence features, including anonymously injecting data feeds and aggregated data about email addresses, file names, IP addresses, search strings and queries into their security monitoring dashboards, to help improve security for their users. As Ganaspersad and Shirilele (2015) suggest, the key aim of Cyber security should be to promote the sharing of vulnerability and Cyber threat information between the private and public sectors.
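
A minimal sketch of this collective-intelligence idea: several member institutions contribute anonymised indicator feeds (IP addresses, file hashes), and each member screens its own events against the pooled set. The feed contents, field names and function names are illustrative assumptions, not a real sharing format.

```python
# Sketch of pooled threat-indicator sharing between member institutions.
# All indicator values below are illustrative (documentation/example ranges).

def aggregate_feeds(*feeds):
    """Merge indicator feeds from multiple contributors into one pooled set."""
    pooled = set()
    for feed in feeds:
        pooled.update(feed)
    return pooled

def screen_events(events, pooled_indicators):
    """Return the events that match a shared indicator of compromise."""
    return [e for e in events if e["indicator"] in pooled_indicators]

# Hypothetical feeds from three contributors.
bank_a = {"198.51.100.7", "e3b0c44298fc1c14"}   # an IP and a file-hash prefix
bank_b = {"203.0.113.9"}
vendor = {"198.51.100.7", "bad-domain.example"}

pool = aggregate_feeds(bank_a, bank_b, vendor)
hits = screen_events(
    [{"indicator": "203.0.113.9", "user": "u1"},   # matches bank_b's feed
     {"indicator": "192.0.2.1", "user": "u2"}],    # unknown, not flagged
    pool,
)
```

Each member benefits from indicators it never observed itself, which is the point of the networked defence described above.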

With regard to technology and Financial Intelligence professional services, it is increasingly difficult for traditional Cyber security products, namely antivirus scanners and firewalls, to thwart every threat created by the security vulnerabilities that mobile, cloud and social computing bring. Network security analysers and similar tools are difficult for enterprises to use effectively without specialized Cyber security talent and help from other firms. Professional services companies have therefore introduced security offerings that integrate human intelligence with the analytical and automation capabilities of information technology platforms. These offerings enable firms to collect, analyse and monitor large data sets in order to identify patterns that suggest attempted breaches by Cybercriminals, allowing enterprises to respond to threats with more agility. They also allow firms to thoroughly audit Cyber security risks whenever they are expected to disclose security incidents and risks. Firms no longer rely on passive defences alone; joining analytics and automation to human judgment, and tapping into collective Financial Intelligence, can lower the cost of mitigating Cyber attacks and reduce the risk of such attacks (Bissell, Mahidhar, & Schatsky, 2013).
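
The "analytics plus automation" idea can be sketched very simply: scan a large event log and flag accounts whose activity sits far above the population mean. The data, account names and z-score threshold below are illustrative assumptions, not a tuned detection rule.

```python
# Sketch: flag accounts with anomalously many failed logins.
# Threshold and sample data are illustrative assumptions.
from collections import Counter
from statistics import mean, pstdev

def flag_anomalies(failed_logins, z_threshold=2.0):
    """Return accounts whose failed-login count exceeds mean + z * stddev."""
    counts = Counter(failed_logins)
    mu = mean(counts.values())
    sigma = pstdev(counts.values())
    if sigma == 0:                       # all accounts behave identically
        return []
    return [acct for acct, n in counts.items()
            if (n - mu) / sigma > z_threshold]

# Ten ordinary accounts with 2 failures each, one account with 40.
events = [f"user{i}" for i in range(10) for _ in range(2)] + ["mallory"] * 40
suspicious = flag_anomalies(events)      # only "mallory" stands out
```

A real platform would add many more signals, but the shape is the same: automate the scan, and hand only the outliers to human judgment.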

According to Seddon (2015), Financial Crime Risk Assessment should encompass the following: access rights and controls; data loss prevention; vendor management; training; and an incident response plan. Adequate access rights and controls, such as multifactor authentication, are required to prevent unauthorized access to information systems. This includes reviewing controls associated with customer logins, remote access, tiered access, network segmentation and passwords. Data loss prevention involves implementing adequate and effective controls in the areas of system configuration and patch management, including monitoring network traffic and the potential transfer of unauthorized data via uploads and email attachments (Ganaspersad & Shirilele, 2015).
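
The multifactor authentication mentioned above commonly uses a time-based one-time password (TOTP, RFC 6238) as the second factor. A minimal standard-library sketch:

```python
# RFC 6238 time-based one-time password (HMAC-SHA1), standard library only.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Derive the current one-time code from a shared Base32 secret."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in Base32).
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
```

A login then succeeds only when both the password check and a constant-time comparison such as `hmac.compare_digest(submitted_code, totp(SECRET))` pass.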

Vendor management encompasses the controls and practices used to select and evaluate external providers, including due diligence in vendor monitoring, selection and oversight. It also covers how vendor relationships are considered as part of the firm's ongoing risk assessment process (Seddon, 2015).

There is a need for adequate training of vendors and employees with respect to customer security and the confidentiality of customer information and records. According to Seddon (2015), the training should be tailored to encourage responsible vendor and employee behaviour, and incident response procedures should be integrated into regular training programs. Incident response plans include assessing system vulnerabilities, assigning roles, and determining which firm assets, services or data warrant protection.

CONCLUSION

Financial Intelligence is a critical area for unravelling the financial networks that support illicit and dangerous Cybercriminals. Efforts to combat Cybercrime must therefore take a multidisciplinary approach to understanding the enabling factors and driving forces behind it. The cost-benefit balance and efficacy of existing Anti-Money Laundering practices should also be taken into account.

REFERENCES

1.American Security Project. (2011). Threat Finance and Financial Intelligence. Retrieved 2017 from https://www.americansecurityproject.org/asymmetric-operations/threat-finance-and-financial-intelligence/

2.Assocham. (2015, June). Current fraud trends in the financial sector. Retrieved 2017 from PWC: https://www.pwc.in/assets/pdfs/publications/2015/current-fraud-trends-in-the-financial-sector.pdf

3.Bank of England. (2016). CBEST Intelligence-Led Testing: Understanding Cyber Threat Intelligence Operations. Retrieved 2017 from http://www.bankofengland.co.uk/financialstability/fsc/Documents/cbestthreatintelligenceframework.pdf

4.Bissell, K., Mahidhar, V., & Schatsky, D. (2013, August 13). Fighting Cybercrime with Collective Intelligence and Technology. Retrieved 2017 from The Wall Street Journal: http://deloitte.wsj.com/riskandcompliance/2013/08/13/fighting-cyber-crime-with-collective-intelligence-and-technology/

5.Boorman, J., & Ingves, S. (2001, February 12). Financial System Abuse, Financial Crime and Money Laundering— Background Paper. Retrieved 2017 from IMF: https://www.imf.org/external/np/ml/2001/eng/021201.pdf

6.Crown Prosecution. (2002). Proceeds Of Crime Act 2002 Part 7 – Money Laundering Offences. Retrieved 2017 from http://www.cps.gov.uk/legal/p_to_r/proceeds_of_crime_money_laundering/

7.Daws, M. (2015, September 18). Fraud risk management and IT security should converge to protect against organized and cyber crime . Retrieved 2017 from http://financeandriskblog.accenture.com/cyber-risk/finance-and-risk/fraud-risk-management-and-it-security-should-converge-to-protect-against-organized-and-cyber-crime

8.Denning, D. (1998). Information Warfare and Security. Reading: Addison-Wesley.

9.Ganaspersad, R., & Shirilele, N. (2015, July 23). Financial Crime Risk Management (FCRM) Policy. Retrieved 2017 from https://www.hollard.co.za/binaries/content/assets/hollardcoza/pages/about-us/legal-requirements/south-africa/annexure-a–hollard-fcrm-policy_2015-final-approved-by-board.pdf

10.Interpol. (2017). Financial crime. Retrieved 2017 from https://www.interpol.int/Crime-areas/Financial-crime/Financial-crime

11.Nakashima, E. (2011, November 6). Warning as US companies lose out through cyber-spies. Retrieved 2017 from https://www.pressreader.com/south-africa/the-sunday-independent/20111106/282475705615821

12.Parker, D. (1998). Fighting computer crime: A new framework for protecting information. New York: John Wiley & Sons, Inc.

13.Passas, N. (2003). Hawala and Other Informal Value Transfer Systems: How to Regulate Them? Risk Management , 5 (2), 49–59.

14.Post, J. (1998). The dangerous information system insider: psychological perspectives. From http://www.infowar.com

15.Post, J., Shaw, E., & Ruby, K. (1998). Information terrorism and the dangerous insider. InfowarCon’98. Washington, DC.

16.Rogers, M. K. (2011). Chapter 14 The Psyche of Cybercriminals: A Psycho-Social Perspective. In S. Ghosh, & E. Turrini, Cybercrimes: A Multidisciplinary Analysis. Springer-Verlag Berlin Heidelberg.

17.Rogers, M. (2005). The development of a meaningful hacker taxonomy: a two dimensional approach. NIJ National Conference 2005. Purdue University.

18.Rogers, M., & Ogloff, J. (2004, Spring). A comparative analysis of Canadian computer and general criminals. Canadian Journal of Police & Security Services , 366-376.

19.Seddon, J. (2015, October 7). Cyber crime – a growing threat to financial institutions. Retrieved 2017 from Cyber crime – a growing threat to financial institutions

20.Sewell, W. (2004). Protecting against Cyber terrorism. Public Works , 135 (3), 39-43.

21.SIPA. (2011, February). Guidelines For Risk Assessment And Implementation Of The Law On Prevention Of Money Laundering And Financing Of Terrorist Activities For Obligors. Retrieved 2017 from Financial Intelligence Department, Bosna i Hercegovina Ministarstvo sigurnosti: http://www.sipa.gov.ba/assets/files/secondary-legislation/smjernicefoo-en.pdf

 

HOW COULD DECEIVING TECHNIQUES BE USED AGAINST CYBER ATTACKS

FADI ABU ZUHRI

INTRODUCTION

Deception refers to actions deliberately performed by a sender to make the receiver hold a belief different from what the sender considers to be true, to the receiver's disadvantage. In defensive use, it involves planned actions that present false information to an attacker so that the attacker's own actions contribute to the defense of the computer system (Spafford, 2016). Deceiving techniques falsify the perception of reality, and deception may be deliberate, accidental or self-induced. Deliberate deception is used to defend a system especially when it is intended to disadvantage the attacker. In most situations deception includes hiding the real (dissimulation) and displaying the false (simulation). Deceiving techniques used against cyber attacks include masking, repackaging, dazzling, mimicking, inventing and decoying (Almeshekah, 2015; Spafford, 2016). This article explores how these techniques may be used against cyber attacks.

MASKING

In masking, the real is hidden by ensuring that the relevant object goes undetected, in some cases by blending it into an irrelevant background. For example, a private message sent to a group email could be written in a white font on a white background. There are also cases where malicious JavaScript is embedded as white space in a benign-looking script (Almeshekah & Spafford, 2014).

Attackers have used masking to hide damaging scripts by giving the text and the background the same colour. Defenders can use it too: hiding the software and services a machine runs conceals them from an intruder, especially once suspicious activity is noticed.
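
The white-on-white trick can be detected mechanically: flag any element whose inline style sets the font colour equal to the background colour. This is a narrow sketch assuming inline `style="…"` attributes; the function name and sample page are illustrative.

```python
# Sketch: detect text masked by matching foreground and background colours
# in inline HTML styles. Assumes simple style="..." attributes only.
import re

def find_masked_text(html):
    """Return text whose inline colour equals its background colour."""
    masked = []
    for style, text in re.findall(r'style="([^"]*)"[^>]*>([^<]*)', html):
        props = dict(
            p.split(":", 1) for p in style.rstrip(";").split(";") if ":" in p
        )
        color = props.get("color", "").strip()
        background = props.get("background-color", "").strip()
        if color and color == background:
            masked.append(text)
    return masked

page = ('<p style="color:#fff;background-color:#fff">hidden payload</p>'
        '<p style="color:#000;background-color:#fff">visible text</p>')
```

A production scanner would also resolve CSS classes and inherited styles, which this sketch deliberately ignores.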

REPACKAGING

The repackaging technique hides the real by making an object appear different from what it actually is. An example is a cyber-attack email repackaged with a friendly, official-looking headline to lure the receiver into opening the message. In other cases, an anonymous remailer may be used to replace the sender's real identity in an email message.

Repackaging may be used as a defense mechanism, but attackers also use it to deceive users. For example, Cross-Site Scripting (XSS) uses repackaging when a dangerous post is presented as harmless so as to steal users' cookies when they access it. Repackaging is also used in Cross-Site Request Forgery (XSRF), where an attacker lures the user to an attractive web page that silently tricks the user into taking part in unwanted activities. Further, some attackers repackage their malware as anti-virus software to deceive users into installing it, letting them take control of the users' machines (Almeshekah & Spafford, 2014).

As a defense mechanism, repackaging may be used to create “honey-files”: files presented as normal which act as alarms for system administrators when accessed by an attacker. Honey-files can also be given enticing names targeted at intruders, acting as a beacon that reveals whoever accesses them.
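
A minimal honey-file sketch: plant a decoy with an enticing name, and raise an alarm whenever the application-level file access layer sees it opened. The file name, bait content and helper names are illustrative assumptions; real deployments typically hook filesystem auditing rather than wrapping `open`.

```python
# Sketch of a honey-file: a decoy whose every access is suspicious.
import os
import tempfile

ALERTS = []

def plant_honeyfile(directory, name="passwords_backup.txt"):
    """Create a decoy file with an enticing name. Its content is pure bait."""
    path = os.path.join(directory, name)
    with open(path, "w") as f:
        f.write("user:admin password:correct-horse\n")   # bait, not real data
    return path

def guarded_open(path, honeyfiles, mode="r"):
    """Application-level file access that raises an alarm for decoys."""
    if os.path.abspath(path) in honeyfiles:
        ALERTS.append(f"honey-file accessed: {path}")
    return open(path, mode)

with tempfile.TemporaryDirectory() as d:
    decoy = plant_honeyfile(d)
    honeyset = {os.path.abspath(decoy)}
    with guarded_open(decoy, honeyset) as f:   # simulated attacker reads bait
        bait = f.read()
```

Because no legitimate process ever touches the decoy, any alert here has a very low false-positive rate.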

DAZZLING

Dazzling is a technique that induces confusion, for example through obfuscation and randomization of identified elements. It hides the real by making identification of the object less certain, creating confusion about its true nature. An encrypted channel, for example, uses obfuscation: the message is transmitted openly yet remains hidden. The honeywords proposal is a dazzling scheme that hides a real password in a list of several fake passwords, giving the attacker different passwords to choose from, only one of which is true. If an attacker uses any of the stolen fake passwords on the system, an alarm alerts the administrators that the passwords have been stolen (Juels & Rivest, 2013).

MIMICKING

Mimicking is a simulation technique that invents the false by imitating the relevant traits of a real object. For instance, an attack could use a web page that appears valid and similar to that of a reputable firm, yet is a malicious page established by the attacker.

To defend against attackers, mimicking software and services may be applied so that a system gives the responses of another system. For example, the system could respond as though it were Windows XP when it is actually Windows 7. The attackers' resources are then wasted exploiting Windows XP instead of Windows 7 (Murphy, McDonald, & Mills, 2010).
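
One simple form of this is banner spoofing: answer fingerprinting probes with the banner of a different, older system. The banner strings and probe names below are illustrative assumptions, not output from any real fingerprinting tool.

```python
# Sketch of service mimicry via spoofed banners: attackers fingerprinting
# the host see an older platform and waste effort on the wrong exploits.
# Banner strings are illustrative assumptions.

FAKE_BANNERS = {
    "ssh": "SSH-2.0-OpenSSH_3.4p1",       # pretends to be an old SSH release
    "http": "Server: Microsoft-IIS/5.1",  # pretends to be a Windows XP-era IIS
}

def respond_to_probe(service, real_banner):
    """Return the decoy banner for a mimicked service, hiding the real one."""
    return FAKE_BANNERS.get(service, real_banner)
```

Full OS obfuscation, as in Murphy et al., also reshapes TCP/IP stack behaviour, since fingerprinting tools look well beyond banners.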

INVENTING

The inventing simulation technique invents the false by creating the perception that relevant objects exist when, in reality, they do not. Inventing is used in Honeypots, where a Honeypot presents the appearance of a subnet of machines at specific addresses when no real machines answer at those IP addresses.

Honeypots have been used widely in security applications such as detecting spam and inhibiting spamming operations. They have also been used to analyse malware and secure databases, and today they are applied in mobile environments as well. The two major types are client and server Honeypots. Client Honeypots act as deliberately vulnerable user agents that actively visit many servers in order to detect compromised systems; when one is detected, the client reports information about the infection. Server Honeypots, on the other hand, hold no vital information and may be built to appear vulnerable so as to entice attackers into attacking them. In security systems, Honeypots are applied in attack detection, attack prevention, research, and incident response (Almeshekah & Spafford, 2014).
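
A server Honeypot can be sketched as a bare listener: it serves no production purpose, so every connection it receives is suspicious and gets logged. The fake FTP banner, port choice and log fields are illustrative assumptions; real honeypots emulate far richer protocol behaviour.

```python
# Minimal server-honeypot sketch: log every connection to a decoy service.
import socket
import threading

def run_honeypot(log, ready, stop_after=1):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))            # ephemeral port; nothing real here
    srv.listen(1)
    ready["port"] = srv.getsockname()[1]
    ready["event"].set()
    for _ in range(stop_after):
        conn, addr = srv.accept()
        log.append({"peer": addr[0], "first_bytes": conn.recv(64)})
        conn.sendall(b"220 ftp ready\r\n")   # fake banner to keep them talking
        conn.close()
    srv.close()

log, ready = [], {"event": threading.Event()}
t = threading.Thread(target=run_honeypot, args=(log, ready))
t.start()
ready["event"].wait()

# Simulated attacker probing the decoy service.
c = socket.create_connection(("127.0.0.1", ready["port"]))
c.sendall(b"USER admin\r\n")
banner = c.recv(64)
c.close()
t.join()
```

Because legitimate traffic never reaches this socket, the log needs no filtering: each entry is already a candidate incident.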

In attack detection, Honeypots feed mechanisms such as intrusion detection systems, which then detect more accurately than traditional mechanisms (Almeshekah & Spafford, 2014). Honeypots generate minimal logging data since they are not used for daily operations, so any interaction with them tends to be illicit. Shadow Honeypots, for example, have yielded positive results in detection architectures: anomaly sensors placed in front of the system decide the destination of each request, and several security systems integrate Honeypots into the real system by diverting suspicious traffic to the shadow system for further investigation. Honeypots have also been used to detect attacks across wide areas of the network.

Studies on the prevention of cyber-attacks indicate that Honeypots are useful since they reduce the speed of attackers and in some cases hinder their activities. Dormant IP addresses are an example of Honeypots used to slow attackers down by interacting with them. Chen et al. (2008) reported that deception toolkits can confuse attackers, hinder them from reaching the server, and even push some of the risk back to the attacker's side. Honeypots use traps and enticements to secure the system, and other studies report their use as a deterrent, protecting the system by keeping the attacker from gaining access. The success of Honeypots has even prompted the creation of anti-Honeypot mechanisms by attackers seeking to detect and avoid them.

Honeypots are also effective in incident response. Because a Honeypot is independent of the production system, it can be analysed and disconnected after a sustained attack without hindering production. In forensic analysis, Honeypots are useful in that they preserve the state of the attacker on the system, giving room to analyse what happened (Almeshekah & Spafford, 2014).

In research, Honeypots are used to find new types of malware and analyse them. Depending on the type of attack observed, it becomes possible to develop a security tool that improves security; for example, Honeypots have been used to derive security signatures. Among the tools designed to capture computer malware is Dionaea, which stores captured malware samples. Honeypots also offer a deep understanding of the most common types of attack.

DECOYING

Decoying is a simulation technique used to attract attention away from the relevant objects. It has been used where a web page presents false yet believable information about basic systems so as to draw the user's attention away from the source of the real data. In some cases, Honeypots may make the attacker believe that one of an organization's systems is vulnerable, thus capturing the attacker's attention (Carroll & Grosu, 2011).

CONCLUSION

Deceiving techniques have been used widely to offer protection against cyber attacks. A single deception may combine dissimulation and simulation techniques, hiding the real while making sure that the false is what the attacker sees. The attacker's pattern needs to be analysed in order to settle on the specific deceiving technique to use. Applying the techniques discussed in this article will give a system defenses against attackers.

While finding evidence is key, doing it legally is equally important. The use of deceiving techniques to catch a criminal may be considered illegal in certain jurisdictions; for example, an intruder could claim that a Honeytrap amounted to entrapment. Privacy issues also need to be considered (Yasinsac & Manzano, 2002).

REFERENCES

1.Al Kawasmi, E., Arnautovic, E., & Svetinovic, D. (2015). Bitcoin-Based Decentralized Carbon Emissions Trading Infrastructure Model. Systems Engineering, 18 (2), 115-130.

2.Almeshekah, J. (2015). Using Deception to Enhance Security: A Taxonomy, Model and Novel Uses. PhD thesis, Purdue University.

3.Almeshekah, M., & Spafford, E. (2014). Using Deceptive Information in Computer Security Defenses. International Journal of Cyber Warfare and Terrorism , 4 (3), 46-58.

4.Baur, A. W., Bühler, J., Bick, M., & Bonorden, C. S. (2015). Cryptocurrencies as a disruption? empirical findings on user adoption and future potential of bitcoin and co. In Conference on e-Business, e-Services and e-Society (pp. 63-80). Springer International Publishing.

5.Burniske, C., & White, A. (2017, January). Bitcoin: Ringing the bell for a new asset class. Retrieved 2017, from Ark Invest: http://research.ark-invest.com/bitcoin-asset-class

6.Carroll, T., & Grosu, D. (2011). A Game Theoretic Investigation of Deception in Network Security. Security and Communication Networks , 4 (10), 1162–1172.

7.Chen, X., Andersen, J., Mao, Z., & Bailey, M. (2008). Towards an Understanding of Anti-Virtualization and Anti-Debugging Behavior in Modern Malware. IEEE International Conference on Dependable Systems and Networks, (pp. 177–186).

8.Clinton, P. (2014, March). Driving a Drug Dealer’s Car. Retrieved 2017, from http://www.government-fleet.com/channel/procurement/article/story/2014/03/driving-a-drug-dealer-s-car.aspx

9.Gao, X., Clark, G. D., & Lindqvist, J. (2016). Of Two Minds, Multiple Addresses, and One Ledger: Characterizing Opinions, Knowledge, and Perceptions of Bitcoin Across Users and Non-Users. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, (pp. 1656-1668). Santa Clara, California.

10.Juels, A., & Rivest, R. (2013). Honeywords: Making Password-Cracking Detectable. SIGSAC Conference on Computer & Communications Security (pp. 145–160). ACM.

11.Kostakis, V., & Giotitsas, C. (2014). The (A) political economy of Bitcoin. Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society , 12 (2), 431-440.

12.Lopez, G. (2016, September 24). You are way more likely to be killed by deer than by sharks, bears, and gators combined. Retrieved 2017, from https://www.vox.com/2016/9/24/13032272/killer-animals-deer-sharks-bears

13.Muncaster, P. (2017, June 15). World’s Largest Bitcoin Exchange Bitfinex Crippled by DDoS. Retrieved 2017, from https://www.infosecurity-magazine.com/news/worlds-largest-bitcoin-exchange/

14.Murphy, S., McDonald, J., & Mills, R. F. (2010). An Application of Deception in Cyberspace: Operating System Obfuscation. 5th International Conference on Information Warfare and Security, (pp. 241–249).

15.Rogojanu, A., & Badea, L. (2014). The issue of competing currencies. Case study–Bitcoin. Theoretical and Applied Economics , 21 (1), 103-114.

16.Spafford, E. (2016). Some musings on cyber security by a cyber-iconoclast-UNH Alvine Lecture. Retrieved June 11, 2017, from https://m.youtube.com/watch?v=LPBlCJ0zEJc

17.Yasinsac, A., & Manzano, Y. (2002). Honeytraps, a network forensic tool. Sixth Multi-Conference on Systemics, Cybernetics and Informatics. Orlando, Florida, USA.

HOW ONE OF YOUR VIRTUAL PERSONAS COULD BE WORTH 500,000,000.00 EURO

FADI ABU ZUHRI

INTRODUCTION

The advancement of technology has brought about massive change in people's lives. These developments have greatly affected how they transact and behave online. Many activities that were once conducted face-to-face have moved into the virtual world. More and more people have built comprehensive online profiles to shop, bank and connect with friends, to the point that they have created a Virtual Identity, or Persona, of themselves.

An individual's Virtual Persona allows them to check their credit status and bank balances, and to engage in gaming, socializing, dating, blogging and so on. This makes your Virtual Identity of immense value to organizations and people. Your online behaviour indicates your buying patterns, and your social and financial status attracts certain people who want to befriend you.

A Virtual Persona has real value, and certain entities may want to access it and impersonate its owner in the virtual world. Data derived from the Virtual Persona has become a source of profit, both legal and illegal. The widespread, unrestricted and often illegal use of private information necessitates effective online Identity Management to create a safe online environment for ecommerce and Internet usage as a whole (Smedinghoff, 2011).

People need to understand that in the virtual world their online identities have immense value. Earlier, people stored their identity cards in their wallets. Now these are stored online – whether it is your social, legal or financial profile. This means your Virtual Identity can potentially be stolen electronically. Even something as harmless as online gaming is subject to the same threats. Games such as "World of Warcraft" are termed Massively Multiplayer Online Role-Playing Games (MMORPGs) as they engage a huge number of users. "World of Warcraft" holds the Guinness World Record for the most monthly subscribers, at 11.6 million (Mitchell, 2009). The other most-played MMORPGs include Final Fantasy, The Elder Scrolls Online, Guild Wars 2, Blade & Soul, Black Desert Online, RuneScape, EVE Online and Star Wars (IG Critic, 2016). Various Augmented Reality games, Pokémon Go for example, are also gaining popularity. Such virtual communities are not immune to cyber attacks.

This article explores the subject of Virtual Identity and the risks and opportunities of losing it to cyber theft. It reports on how organizations, legally and illegally, are analysing your Virtual Persona, and what losing access to your Virtual Identity could mean. The focus is on Virtual Reality (VR), Augmented Reality (AR), and the analytical tools and services available to analyse Virtual Identities.

VIRTUAL REALITY: RISK & OPPORTUNITIES

Virtual Reality (VR) describes the world that exists in our minds when we are interacting online. It is a computer-generated artificial environment that users can interact with (Biocca & Levy, 1995). This artificial environment can be experienced via stimuli, such as sounds and sights, afforded by a computer. Virtual Identities are created in VR and represent users in video games, chat rooms, virtual common spaces and other similar environments. These identities, aimed at complementing various virtual spaces and platforms, are simply referred to as "Avatars" (Morgan, 2009). An Avatar includes a representative video or image, a profile, a name, or a "handle" that offers more information about an individual's Virtual Identity.

People create virtual identities by creating virtual representatives of themselves (Rheingold, 1991). In online games, an individual's Virtual Identity may form part of their identity but may differ from their actual identity. In other spaces, such as Basecamp, Virtual Identities may be less creatively oriented and represent the user's actual physical identity, with the user using their own image or name for an Avatar (Witmer & Singer, 1998).

These virtual platforms pose special risks to users, as they are hubs for Cybercriminals. VR technology is built upon existing platforms (Lanier, 1992), so it offers little new attack opportunity in itself. At the highest level, VR is largely a new input and display mechanism added to traditional devices; the underlying computers (whether a mobile, personal computer or console device) have not really changed much. However, VR facilitates positional and orientation tracking: physical body movements are tracked. This comprehensive behaviour tracking can be quantified to understand preferences, divert the user's attention and even sell things (Rubin, 2016). Perhaps the risk posed by VR is no greater than that of any other device or software the user may add to his or her computer.

Today, the use of VR in gaming provides users with a fantasy world that is disconnected from reality. This offers identity thieves the opportunity to attack VR and monetize such attacks via social engineering.

Finally, tracking data from online shopping facilitated through VR may allow Cybercriminals to mount dangerous attacks. Online shopping provides users with an entirely different VR experience, allowing them to browse items online and even try them on their Avatar. Unfortunately, the program used can identify a person's debit or credit card, and Cybercriminals can capture and sell this information.

A Cybercriminal can also track VR/AR headsets, using web-coding tricks to find valuable information about the user, monitor mouse clicks and movements, and use this data to recreate the user's actions, in much the same way one could mimic manual PIN entry (Fox, Arena, & Bailenson, 2009).

AUGMENTED REALITY: RISK & OPPORTUNITIES

Augmented Reality (AR) describes a series of technologies (e.g., Head-Mounted Displays (HMDs)) that make possible the real-time mixing of computer-generated content with video display (Azuma R. T., 1997). It is used to integrate virtual information into a person's physical environment so that they perceive it as existing in that environment (Janin, Mizell, & Caudell, 1993). Its functioning is based on techniques developed in VR, and it interacts with the virtual world. AR technologies are defined by the following features: (1) interactive in real-time; (2) combining virtual and real; and (3) registered in 3D (Azuma, Baillot, Behringer, Feiner, Julier, & MacIntyre, 2001). Accurate registration and tracking ensure the user obtains a believable image. As such, the three key building blocks of AR systems are real-time rendering, display technology, and tracking and registration (de Sa & Churchill, 2012).

New mobile wearable computing applications supporting AR functionality are increasingly becoming possible as computers decrease in size and increase in power, making it possible for users to access online services everywhere, always. This flexibility allows applications that exploit the surrounding context. AR thus presents a powerful User Interface (UI) to context-aware computing environments (Mekni & Lemieu, 2013). Currently, AR exists in consumer products including Microsoft's HoloLens, Google Glass, Apple's iPhone X, Samsung Pixie and games such as Pokémon Go.

AR devices may be prone to attacks that lead to identity theft. For instance, a Cybercriminal using Social Engineering and 3D models can alter and create fake videos and games. Computer scientists and animators have already succeeded in creating techniques that take a voice recording of a person and make them appear to say something they never said. They can give a person different lip movements and expressions by altering the person's video. This can be achieved by tracking a history of the person's movements in VR. While these fake videos are yet to be perfected, they demonstrate how accurate 3D models and VR tracking could change things. An individual's unique identifiers could be their physical or verbal "tics" or unique movements. If compromised, Cybercriminals can use these personal intricacies to digitally impersonate a user or to socially engineer one's friends (Shatte, Holdsworth, & Lee, 2014).

AR technology was developed over forty years ago; Pokémon Go just made it mainstream. Cybercriminals see AR as an opportunity, and have already seized on the popularity of games and various other applications to execute their malicious intents (Zhou, Duh, & Billinghurst, 2008). They have succeeded in creating Windows ransomware, SMS spam, scareware apps and lockscreen apps for this purpose. They use a fake Windows-based Pokémon Go Bot to attack users of Pokémon Go Bot. This application levels up the user's account with little effort by mimicking the role of a fake Pokémon trainer (Paz, 2016).

People are also exploiting Pokémon Go to spread malware to the AR game via bogus guides (Tynan, 2017). Augmented wearable technology poses a serious risk, as images in a person's field of view could be manipulated: Cybercriminals can substitute real virtual objects with fake ones. AR Cybercriminals could also reinvent a new version of ransomware for malicious purposes. Using this new breed of ransomware, they could make a Doctor using Microsoft HoloLens lose control of it or pay a ransom. Cybercriminals can also use AR devices to collect personal health data and biometric data for malicious ends (Boyajian, 2017).

ANALYTICAL TOOLS AND SERVICES

Online technology has generated a huge amount of data from video streaming, social media activities, online game playing and Internet browsing. These data accumulate day by day from various sources, through different methods of input and different technologies. The accumulated data are called "Big Data", which is broad, fast and voluminous. It may be structured or unstructured, but is still useful for deriving data sets and subsets that online and non-online companies buy, sell and use to increase market coverage and profits (Tiwarkhede & Kakde, 2015).

Companies engaged in analytic services record and then sell online profiles: user/screen names, email addresses, website addresses, interests, preferences, home addresses, professional history, and the number of friends or followers a user has. There are also companies that gather and synthesize data on a user's tweets, posts, comments, likes, shares and recommendations across their social media accounts (Beckett, 2012).

The analytic-services and online-data industry is reported to be a $300-billion-a-year industry, employing around 3 million people in the United States alone (Morris & Lavandera, 2012). There are many successful companies that provide analytical services and data brokering; these companies supposedly know more about you than Google. The list includes Acxiom, Corelogic, Datalogix, eBureau, ID Analytics, Intelius, PeekYou, Rapleaf and Recorded Future (Mirani & Nisen, 2014). What they do is look into users' online personal profiles, gather information such as names, friends, activities and interests, and sell it to end users for advertising, marketing and other legitimate economic activities. Essentially, a broker collects information like contact details, interests, preferences and demographics, then aggregates it into the subsets needed by, or applicable to, its clients. Acxiom alone has recorded over a billion dollars in revenue for its analytical services, covering 144 million US households (Morris & Lavandera, 2012).
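The aggregation step brokers perform can be illustrated with a small sketch. The profile records below are entirely made up, and the interest-based segmentation is only a simplified stand-in for the subsets a broker might actually sell:

```python
from collections import defaultdict

# Hypothetical raw profiles, mimicking what a data broker might gather.
profiles = [
    {"name": "A. Smith", "age": 34, "interests": ["golf", "finance"]},
    {"name": "B. Jones", "age": 29, "interests": ["gaming", "finance"]},
    {"name": "C. Brown", "age": 41, "interests": ["golf", "travel"]},
]

def segment_by_interest(profiles):
    """Aggregate profiles into interest-based segments, the kind of
    subset a broker might package for an advertiser."""
    segments = defaultdict(list)
    for profile in profiles:
        for interest in profile["interests"]:
            segments[interest].append(profile["name"])
    return dict(segments)

segments = segment_by_interest(profiles)
```

An advertiser buying the "finance" segment here would receive the names of everyone whose profile listed that interest, regardless of whether those people ever consented to being grouped that way.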

Data brokers are intelligent in gathering data and know how to use it. They take advantage of the vast data available online to deliver relevant services to users, suggesting products and services that users might need, or subliminally suggesting that they need them. These companies claim that all the information gathered and sold is legal, secure and suitable for the users. Data brokers cater to customers ranging from small enterprises to large Fortune 500 companies (Morris & Lavandera, 2012).

Data brokers source their information from a variety of places. For example, Facebook, Google and other free apps collect your data and sell it to those willing to pay for it. And then there are Cybercriminals who steal this information and sell it on the dark net.

It is scary to think what damage a cyber attack on data aggregators could do. In September 2017, Equifax reported a massive data breach. Initially reported as affecting 143 million people, the estimate was later revised to 145.5 million. Cybercriminals accessed consumers' highly sensitive personal and financial information, including names, birthdates, addresses and credit card numbers (Hackett, 2017).

CONCLUSION

The value of a user's virtual persona depends on its legality, usage and the purpose of its application. Bank details, credit history and the availability of personal documents like a driver's license are seen as high value. The Financial Times has published a calculator showing what each bit of your personal information is worth (Steel, Locke, Cadman, & Freese, 2013). The more that is revealed about your real and virtual behaviour, the more valuable your information is. Consider, too, that this information is constantly traded and resold to multiple buyers. It is not difficult to imagine that over the course of your lifetime (or afterlife) your persona may be worth 500 million Euros.

In almost all cases, the owner of such personal information receives none of the revenue, not even a tiny share, generated by the analytics service providers who sell it to willing buyers. The owners themselves face the risk of a security breach when their information is leaked to undesirable elements who will use their identity to commit fraudulent and criminal activities, leaving them liable for credit fraud or for an unpaid loan they never applied for in the first place. The real owner of the personal data then faces the burden of proving his or her innocence.

AR and VR devices are highly complex and relatively new. They are vulnerable, and attractive to Cybercriminals looking for the weakest link. Some argue that Cybersecurity's weakest link is an organization's own employees (Banham, 2017). In social engineering, as this is known, Cybercriminals deceive their victims and gain their trust. Once the Cybercriminal gains entry, even the best protective software turns useless. Therefore, organizations need to invest in ongoing Cybersecurity awareness for their employees.

Does it make sense to blame people who are the value creators in organizations? Shouldn’t technical systems be built for normal people rather than techies building systems for techies?

REFERENCES

1. Azuma, R. T. (1997). A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments, 6(4), 355-385.

2. Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., & MacIntyre, B. (2001). Recent advances in augmented reality. Computer Graphics and Applications, 21(6), 34–47.

3. Banham, R. (2017, March 20). The Weakest Link In Your Cyber Defenses? Your Own Employees. Retrieved 2017, from https://www.forbes.com/sites/eycybersecurity/2017/03/20/the-weakest-link-in-your-cyber-defenses-your-own-employees/#7815acac5d51

4. Beckett, L. (2012, November 9). Yes, Companies Are Harvesting – and Selling – Your Facebook Profile. Retrieved 2017, from ProPublica: https://www.propublica.org/article/yes-companies-are-harvesting-and-selling-your-social-media-profiles

5. Bimber, O., Raskar, R., & Inami, M. (2005). Spatial Augmented Reality. Wellesley: AK Peters.

6. Biocca, F., & Levy, M. (1995). Communication applications of Virtual Reality. Hillsdale, NJ: Erlbaum.

7. Boyajian, L. (2017, February 27). The 3 biggest challenges facing Augmented Reality. Retrieved 2017, from Network World: http://www.networkworld.com/article/3174804/mobile-wireless/the-3-biggest-challenges-facing-augmented-reality.html

8. de Sa, M., & Churchill, E. (2012). Mobile augmented reality: exploring design and prototyping techniques. 14th International Conference on Human-Computer Interaction with Mobile Devices and Services (pp. 221–23). ACM.

9. Eskelinen, M. (2001). Towards computer game studies. Digital Creativity, 175–183.

10. Fox, J., Arena, D., & Bailenson, J. N. (2009). Virtual Reality: A Survival Guide for the Social Scientist. Journal of Media Psychology, 95–113.

11. Hackett, R. (2017, October 2). Equifax Underestimated by 2.5 Million the Number of Potential Breach Victims. Retrieved 2017, from http://fortune.com/2017/10/02/equifax-credit-breach-total/

12. IG Critic. (2016). Most Played MMORPG Games of 2016. Retrieved 2017, from http://igcritic.com/blog/2016/03/17/most-played-mmorpg-games-of-2016/

13. Janin, A. L., Mizell, D. W., & Caudell, T. P. (1993). Calibration of head-mounted displays for augmented reality applications. (pp. 246–255). IEEE.

14. Lanier, J. (1992). Virtual reality: The promise of the future. Interactive Learning International, 275–279.

15. Mekni, M., & Lemieu, A. (2013). Augmented Reality: Applications, Challenges and Future Trends. Applied Computational Science.

16. Mirani, L., & Nisen, M. (2014, May 27). The nine companies that know more about you than Google or Facebook. Retrieved 2017, from https://qz.com/213900/the-nine-companies-that-know-more-about-you-than-google-or-facebook/

17. Mitchell, B. (2009, June 5). E3 2009: Guinness World Records announces awards at E3. Retrieved 2017, from http://www.ign.com/articles/2009/06/05/e3-2009-guinnes-world-records-announces-awards-at-e3

18. Morgan, G. (2009, July 24). Challenges of Online Game Development: A Review. Simulation & Gaming. (Sage) Retrieved 2017, from Simulation & Gaming: http://research.ncl.ac.uk/game/research/publications/87445d01.pdf

19. Morris, J., & Lavandera, E. (2012, August 12). Why big companies buy, sell your data. Retrieved 2017, from CNN: http://edition.cnn.com/2012/08/23/tech/web/big-data-acxiom/

20. Paz, R. D. (2016, August 24). Pokémon Go Accounts Targeted by Bogus Pokémon Go Bot. Retrieved 2017, from Fortinet: https://blog.fortinet.com/2016/08/24/pokemon-go-accounts-targeted-by-bogus-pokemon-go-bot

21. Rheingold, H. (1991). Virtual reality. New York: Simon & Schuster.

22. Rubin, P. (2016). AR, VR, MR: Making Sense of Magic Leap and the Future of Reality. Retrieved 2017, from https://www.wired.com/2016/04/magic-leap-vr/

23. Shatte, A., Holdsworth, J., & Lee, I. (2014). Mobile augmented reality based context-aware library management system. Expert Systems with Applications, 41(5), 2174–2185.

24. Smedinghoff, T. J. (2011). Introduction to Online Identity Management. Colloquium on Electronic Commerce.

25. Steel, E., Locke, C., Cadman, E., & Freese, B. (2013, June 13). How much is your personal data worth? Retrieved 2017, from http://ig.ft.com/how-much-is-your-personal-data-worth/?mhq5j=e5

26. Tiwarkhede, A. A., & Kakde, V. (2015). A Review Paper on Big Data Analytics. International Journal of Science and Research, 845-848.

27. Tynan, D. (2017, June 9). Augmented reality could be next hacker playground. Retrieved 2017, from https://www.the-parallax.com/2017/06/09/augmented-reality-hacker-playground/

28. Witmer, B., & Singer, M. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and Virtual Environments, 7(3), 225–240.

29. Zhou, F., Duh, B. I., & Billinghurst, M. (2008). Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (pp. 193–202). IEEE Computer Society.

SMARTPHONES AND BIG DATA – THE END OF PRIVACY

FADI ABU ZUHRI

INTRODUCTION

Technology is rapidly advancing. The technology of ten years ago is not the technology of today, and today's will not be the technology of ten years hence, as new technologies will have been adopted (Briggs & Thomas, 2015). Smartphone manufacturers have adopted various biometric security measures such as voice recognition, fingerprints, facial recognition and iris scanners to protect their users. In the not too distant future, biometric scanners and other new security measures will be commonplace. This article shows how such technological advancements can be creepy, as the safety of users' information would no longer be guaranteed.

WHAT YOUR SMARTPHONE MIGHT BE REVEALING ABOUT YOU

A smartphone can say much about a person's personality, including their likes and dislikes, their location, which services they use and how much time they spend on various apps; even their mood can be predicted. The smartphone could in fact trigger services to send the individual targeted advertisements (Tene & Polonetsky, 2013).

A study conducted at Lancaster University indicated that the operating system of a smartphone, whether Android or iOS, can depict the personality of an individual. Apparently, people who used Android phones were found to be more honest and humble than those who used iPhones. Further research indicated that Android users were kinder, more open and less extroverted than iOS users. The authors concluded that the smartphone is the most basic level of personalization, which can tell a lot about a user (Shaw, Ellis, Kendrick, Ziegler, & Wiseman, 2016).

The applications a user downloads can also reveal personality traits, where the person is downloading from and the services they use, allowing advertising companies to send targeted ads to that individual. A future with Radio Frequency Identification (RFID) implants offers a wide range of challenges and opportunities for identifying people (Rotter, Daskala, & Compano, 2008). It has become more and more apparent that the smartphone is a mini digital version of its user, which is why many users do not like other people using their smartphones. This calls for security measures such as biometric scanners to protect users.

THE PROS AND CONS OF BIOMETRIC SCANNERS

Over the years, smartphone manufacturers have managed to upgrade these devices with embedded biometric scanners to protect their users (Mayer-Schönberger & Cukier, 2014). Biometric scanners are beneficial in that they can identify criminals, help understand an individual's online behavior, and predict that person's political or religious affiliations (Hubbard, 2008). For instance, when a criminal tries to withdraw funds from a person's online banking through a smartphone, a biometric scanner may detect the change of fingerprints and trigger mechanisms to protect the user, such as locking down the smartphone to prevent withdrawal of the funds. A biometric scanner could also proactively scan for viruses to protect the smartphone's user (Gilbert, 2009).
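The lockdown behaviour described above can be modelled with a minimal sketch. The hashed fingerprint template and the three-attempt limit are illustrative assumptions for the example, not any vendor's actual mechanism:

```python
import hashlib

class BiometricLock:
    """Toy model of a fingerprint check: the device stores only a hash of
    the enrolled print and locks down after too many mismatches."""

    def __init__(self, enrolled_print, max_attempts=3):
        self.template = hashlib.sha256(enrolled_print).digest()
        self.failures = 0
        self.max_attempts = max_attempts
        self.locked = False

    def try_unlock(self, scanned_print):
        if self.locked:
            return False                      # device stays locked down
        if hashlib.sha256(scanned_print).digest() == self.template:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.max_attempts:
            # e.g. block fund withdrawals until the owner re-verifies
            self.locked = True
        return False
```

A criminal's mismatched prints drive the failure counter up until the device locks, at which point even the correct print is refused until some out-of-band re-verification takes place.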

However, this has proven to be more creepy than beneficial, since the personal information of users can be compromised if someone hacks the biometric scanner. The scanner stores personal information such as an individual's fingerprints, likes and dislikes, app preferences, physical location, etc. (Lieberoth & Hansen, 2011). It could also predict a person's political or religious affiliations; for example, if an election registers voters biometrically, this information can be linked to the person (Greenberger & Padesky, 2015). It is, therefore, evident that future smartphones with more biometric scanners are creepier, as they are in a position to store personal information, identify criminals, understand an individual's online behavior, and reveal his or her political or religious affiliations.

HOW BIG DATA IS MARKING THE END OF PRIVACY

It has been suggested that smartphones will, in future, carry out blood tests and medical scans, and even offer diagnoses by linking to advanced medical profiles and databases. Biosensors linked to smartphones would monitor the patient's vital signs and treatment (Topol, 2016).

Powerful algorithms that run in the backend and link to your smartphone could help the government fight terrorism, or help online retailers predict buying patterns. For example, Amazon, through its Kindle application, knows which sections of a book are most engaging and which are not. This information can be used to target the user with other interesting sections or prompt the reader to buy another book. Big data and real-time, constant surveillance through our smartphones mark the start of new digital revolutions that can change the way we think and interact in a new world. Big data could even predict our future behavior and possibly implicate us for something we did not even do (Mayer-Schönberger & Cukier, 2014).

CONCLUSION

While the benefits of smartphones and in-built security are much touted, one needs to consider the power they are increasingly vested with as technology advances. With the emergence of new technologies, smartphone manufacturers can add more security measures for users while at the same time storing more personal information (Ferguson, 2015). The personal information likely to be kept by a biometric scanner includes an individual's fingerprints, personality traits, likes and dislikes, political and religious affiliations, geo-location, preferred apps and so forth (Fadiman, 2012).

REFERENCES

1. Briggs, P., & Thomas, L. (2015). An inclusive, value sensitive design perspective on future identity technologies. ACM Transactions on Computer-Human Interaction (TOCHI), 22(5).

2. Fadiman, A. (2012). The spirit catches you and you fall down: A Hmong child, her American doctors, and the collision of two cultures. Macmillan.

3. Ferguson, A. G. (2015). Big Data and Predictive Reasonable Suspicion (Vol. 163). University of Pennsylvania Law Review.

4. Gilbert, D. (2009). Stumbling on happiness. USA: Vintage Books.

5. Greenberger, D., & Padesky, C. A. (2015). Mind over Mood: Change how you feel by changing the way you think. USA: Guilford Publications.

6. Hubbard, T. E. (2008). Automatic license plate recognition: an exciting new law enforcement tool with potentially scary consequences. Syracuse Journal of Science & Technology Law, 18(3).

7. Lieberoth, A., & Hansen, F. A. (2011). Can autobiographical memories create better learning? The case of a scary game. Proceedings of ECGBL: The 5th European Conference on Games Based Learning (pp. 350-357). Athens, Greece.

8. Mayer-Schönberger, V., & Cukier, K. (2014). Big data: A revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt.

9. Rotter, P., Daskala, B., & Compano, R. (2008). RFID implants: Opportunities and challenges for identifying people. IEEE Technology and Society Magazine, 27(2).

10. Shaw, H., Ellis, D. A., Kendrick, L.-R., Ziegler, F., & Wiseman, R. (2016). Predicting Smartphone Operating System from Personality and Individual Differences. Cyberpsychology, Behavior, and Social Networking, 19(12), 727-732.

11. Tene, O., & Polonetsky, J. (2013). A theory of creepy: technology, privacy, and shifting social norms. Yale Journal of Law and Technology, 16(1).

12. Topol, E. (2016). The patient will see you now: the future of medicine is in your hands. Basic Books.

DARKER SIDE OF CRYPTOCURRENCY AND THE ROLE OF DARTH VADER

FADI ABU ZUHRI

INTRODUCTION

Cryptocurrency is the latest innovation in currency introduced to the world today. It involves the use of digital assets as a medium of exchange, secured by cryptography. In 2009, Bitcoin became the first decentralized cryptocurrency and, ever since, many others have mushroomed to provide similar services (Gao, Clark, & Lindqvist, 2016).

The introduction of this medium of exchange was aimed at reducing the production of physical currency and bringing the whole idea of digital assets into play. One reason it has grown in popularity is its ability to reduce the amount of currency circulating through the financial institutions present today. Also, assets that have been transformed into Cryptocurrency are harder to follow up as far as legal matters are concerned.

To take an example, Bitcoin has suddenly become a major topic of financial innovation all over the world. Bitcoin offers various benefits: for instance, it has minimal transaction costs, and it is transparent and very open (VPRO, 2015). Despite the fact that more and more people are using it as their preferred medium of exchange, the development of this technology also comes with some hitches. It is estimated that more than 10 million people hold Bitcoin wallets (Burniske & White, 2017). For these reasons, we take a look at the darker side of Cryptocurrency.

BLOCKCHAINS EXPLAINED

A blockchain consists of blocks, each holding some data and linked to other blocks. In Bitcoin, for example, a block is limited to one megabyte of data, which caps the number of transactions that can fit in a block (BitcoinWiki, 2016). A blockchain is open and distributed, and ensures that transactions are immutable and verifiable. This is achieved with a hashing algorithm such as SHA-256, which produces a unique sequence of bits to authenticate the integrity of the data: any change in the data leads to a different hash. The hash of one block is embedded in the next block, and so on (Sims, 2017). While this ensures integrity, it does not by itself prevent anyone from replicating the blockchain. That problem is addressed with a cryptographic nonce (Rogaway, 2004): an arbitrary number added to the block to generate a specific type of hash. In 2017, for example, this meant a sequence of seventeen zeros at the beginning of the hash. Recreating such a hash sequence is infeasible without massive computing power.
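The hash-and-nonce mechanics described above can be sketched in a few lines. This toy proof-of-work demands only two leading zeros rather than Bitcoin's seventeen, so it runs instantly; the block layout is a simplified assumption, not Bitcoin's actual block format:

```python
import hashlib
import json

def mine_block(data, prev_hash, difficulty=2):
    """Search for a nonce that makes the block's SHA-256 hash start with
    `difficulty` zeros (a toy stand-in for Bitcoin's requirement)."""
    nonce = 0
    while True:
        block = {"data": data, "prev": prev_hash, "nonce": nonce}
        digest = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return block, digest
        nonce += 1

# Each block embeds the previous block's hash, forming the chain.
genesis, h0 = mine_block("genesis", "0" * 64)
block1, h1 = mine_block("Alice pays Bob 1 BTC", h0)

# Tampering with genesis would change h0, breaking block1's "prev" link.
assert block1["prev"] == h0 and h1.startswith("00")
```

Because each block's hash depends on its predecessor's hash, altering any historical block forces every later block to be re-mined, which is what makes the recorded transactions practically immutable.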

MISUNDERSTANDING CRYPTOCURRENCY

A lack of public awareness is one cause of misunderstanding of Cryptocurrency. People have not had enough time to educate themselves about how these systems work (Baur, Bühler, Bick, & Bonorden, 2015). Cryptocurrency is a very young technology for dealing with currency. People have embraced the new idea, but not everyone has evaluated the pros and cons of owning and transacting in Cryptocurrencies. This makes them vulnerable to being conned and even losing their fortune.

RISK AND VOLATILITY

Bitcoin and other Cryptocurrencies are still growing and undergoing various developments. Blockchains use a private key (secret key) to access digital currency wallets, trade and transact. Protecting your private key is therefore of utmost importance: like cash, it is irreplaceable if lost or stolen. It is estimated that the value of lost Bitcoins is US$ 950 million (Berke, 2017). If Darth Vader lost his private key, his money would be forever lost in the Internet's virtual space.
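The irrecoverability follows from how the keys work: a wallet's private key is essentially a very large random number, and the public address is derived from it by one-way functions. The sketch below is illustrative only; real Bitcoin derives addresses via ECDSA on the secp256k1 curve followed by SHA-256, RIPEMD-160 and Base58Check encoding, and the single hash here merely stands in for that one-way pipeline.

```python
import hashlib
import secrets

# A private key is ~256 bits of randomness. Nothing about the public
# address lets anyone recompute it, so losing the key means losing the funds.
private_key = secrets.randbits(256)

# Stand-in for the real one-way derivation (elliptic-curve point
# multiplication plus hashing in actual Bitcoin).
address = hashlib.sha256(private_key.to_bytes(32, "big")).hexdigest()

print(f"address: {address}")  # safe to publish
# The private key itself must never be published or lost.
```

Because the derivation only runs one way, a blockchain can verify that the holder of the key controls the address without anyone, including the network itself, being able to reconstruct a lost key from the address.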

The popularity of Cryptocurrency is also increasing, as evidenced by the fact that daily Bitcoin transaction volume is over $200 million (Burniske & White, 2017). There is a risk that this high demand for Bitcoin may fail to be satisfied, raising scepticism among clients about whether the system can meet customer demand. There is also a significant risk of volatility in Bitcoin prices (Kostakis & Giotitsas, 2014). Currently, Bitcoin prices change every day in response to events related to the production and trade of Cryptocurrencies. Finally, at this infancy stage of Bitcoin there may be loopholes that have not yet been discovered, and it is important to find and close them as soon as possible to avoid future catastrophe in transactions.

Your computing power could also be used without your knowledge by cybercriminals for Cryptocurrency mining operations. Such an attack was reported recently when unpatched Windows 2003 web servers were infected with modified mining software for the Monero currency. The loss was estimated at more than US$ 63,000 in digital currency (Seals, 2017).

PRONE TO MALWARE ATTACK

Cryptocurrency stores assets on a publicly accessible digital platform, which can attract cyber-criminals. There are constant attempts to develop malware aimed at stealing this money (Kostakis & Giotitsas, 2014). Hackers from all corners of the world keep trying to break the ciphers with which these Cryptocurrencies have been encrypted. Bitfinex, the largest US dollar-based Bitcoin exchange in the world, suffered a DDoS attack on its systems. Apart from attacks against Cryptocurrency exchanges such as this, DDoS attacks have also hit the Russian exchange BTC-e. Bitcoin inherits decentralisation, which is an advantage but also one of its biggest risks and challenges (Muncaster, 2017).

There is also a risk of malware in the form of viruses and even Trojans. The fact that all transactions involving Cryptocurrencies are conducted via the Internet exposes this type of exchange to malware that might decipher or corrupt information about the Bitcoins. The most probable malware attack is through ransomware, where criminals intercept the information and demand money in exchange.

DECRIMINALISING CRYPTOCURRENCY

It is well known within law enforcement circles that civilian-type vehicles are a preferred choice to blend in with the crowd and go unnoticed. Vehicles such as Toyota Corolla sedans, Ford F-150 pickups and Chevrolet Malibu sedans are popular choices of both drug dealers and narcotics officers in the United States (Clinton, 2014). Does that mean an unmarked Toyota Corolla should be suspected of being an accomplice to a crime?

Anyone who has seen the movie “Jaws” will remember how deadly sharks can be. But in reality, you are more likely to be killed by a deer than by sharks, bears and alligators combined. Statistics show that for every shark-related death on average in the United States, 120 deaths are due to deer, 58 to flying insects and 28 to dogs. This is in stark contrast to 0.18 deaths per year by wolves, or on average one person every five years (Lopez, 2016). These examples are a reminder of how quick people are to dismiss Cryptocurrency because ransomware perpetrators demanded money in Bitcoins. It almost seems as if the media is bent upon projecting Cryptocurrency as the iconic evil currency, similar to Darth Vader of Star Wars fame.

It is argued that for a currency to be a suitable medium of exchange, it should be easily dispersed and easily spent. This is not the case for Cryptocurrencies, which are not easily liquidated and thus cannot be spent as easily as cash. This limitation makes it hard for Cryptocurrencies to be popular among those engaged in “dark business” (Rogojanu & Badea, 2014). As much as it is touted that Cryptocurrencies work outside the traditional modes of banking, evading detection, law enforcement agencies may well have access to tools to keep track of transactions that take place with Cryptocurrency. These arguments suggest that it is a common misconception that Cryptocurrency is a trading ground for illegal business such as money laundering and drug trafficking.

CONCLUSION

Cryptocurrency is a revolutionary concept that is sure to disrupt the market. It comes with many advantages over current mediums of exchange. These advantages have led a lot of people to adopt the technology, sometimes without full knowledge of how this particular business is conducted (Al Kawasmi, Arnautovic, & Svetinovic, 2015).

This technology could be the next big thing as a medium of exchange, but a lot of policy formulation must follow to ensure that there is minimal fraud. Also, with the technology at an infancy stage, much development is still needed to ensure that the whole system functions well and without any doubt from clients.

While fiat currencies are backed by governments, Cryptocurrencies are generated on computer systems with no governmental guarantees. While some find comfort in government backing, it also means that the government can print an unlimited amount of fiat currency. Fiat currency, in some cases, is not backed by any physical asset like gold. On one hand the value of fiat currencies is subject to regulations, market and political forces; on the other hand, Cryptocurrencies are influenced by supply and demand.

Owning a Toyota Corolla is not illegal even though it is a popular vehicle for criminals who wish to go unnoticed. Popular culture criminalises the wolf and the shark, which kill roughly one person every five years and one person a year respectively. The lovely deer is not criminalised even though it is responsible for 120 deaths a year.

Cryptocurrencies are not without their challenges. While there are voices asking for stronger government regulation, does that not defeat the whole premise of decentralisation that Cryptocurrency stands for? Despite the name Cryptocurrency, is it not just another asset class that presents a convenient form of value exchange? Why regulate it any more than you would any taxable good of value?

Is there a solution for recovering encrypted assets locked by a private key? Can you be given ownership of your Cryptocurrency wallet if it was lost or stolen? Losing one's private key is seen as the ultimate risk, with no signs of a viable solution.

REFERENCES

1. Al Kawasmi, E., Arnautovic, E., & Svetinovic, D. (2015). Bitcoin-Based Decentralized Carbon Emissions Trading Infrastructure Model. Systems Engineering, 18(2), 115-130.

2. Baur, A. W., Bühler, J., Bick, M., & Bonorden, C. S. (2015). Cryptocurrencies as a disruption? Empirical findings on user adoption and future potential of Bitcoin and co. In Conference on e-Business, e-Services and e-Society (pp. 63-80). Springer International Publishing.

3. Berke, A. (2017, March 7). How Safe Are Blockchains? It Depends. Retrieved 2017, from https://hbr.org/2017/03/how-safe-are-blockchains-it-depends

4. BitcoinWiki. (2016, April 11). Block size limit controversy. Retrieved 2017, from https://en.bitcoin.it/wiki/Block_size_limit_controversy

5. Burniske, C., & White, A. (2017, January). Bitcoin: Ringing the bell for a new asset class. Retrieved 2017, from Ark Invest: http://research.ark-invest.com/bitcoin-asset-class

6. Clinton, P. (2014, March). Driving a Drug Dealer’s Car. Retrieved 2017, from http://www.government-fleet.com/channel/procurement/article/story/2014/03/driving-a-drug-dealer-s-car.aspx

7. Gao, X., Clark, G. D., & Lindqvist, J. (2016). Of Two Minds, Multiple Addresses, and One Ledger: Characterizing Opinions, Knowledge, and Perceptions of Bitcoin Across Users and Non-Users. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 1656-1668). Santa Clara, California.

8. Kostakis, V., & Giotitsas, C. (2014). The (A)political economy of Bitcoin. Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society, 12(2), 431-440.

9. Lopez, G. (2016, September 24). You are way more likely to be killed by deer than by sharks, bears, and gators combined. Retrieved 2017, from https://www.vox.com/2016/9/24/13032272/killer-animals-deer-sharks-bears

10. Muncaster, P. (2017, June 15). World’s Largest Bitcoin Exchange Bitfinex Crippled by DDoS. Retrieved 2017, from https://www.infosecurity-magazine.com/news/worlds-largest-bitcoin-exchange/

11. Rogaway, P. (2004). Nonce-Based Symmetric Encryption. In B. Roy, & W. Meier (Eds.), Fast Software Encryption. FSE 2004. Lecture Notes in Computer Science (Vol. 3017, pp. 348-358). Berlin, Heidelberg: Springer.

12. Rogojanu, A., & Badea, L. (2014). The issue of competing currencies. Case study: Bitcoin. Theoretical and Applied Economics, 21(1), 103-114.

13. Seals, T. (2017, September 29). Monero-Mining Campaign Takes the Easy Road to Cash Gains. Retrieved 2017, from https://www.infosecurity-magazine.com/news/moneromining-campaign-takes-cash/

14. Sims, G. (2017, September 29). What is a blockchain. Retrieved 2017, from https://www.youtube.com/watch?v=KN-FQR7A6Iw&feature=youtu.be

15. VPRO. (2015, November 1). The Bitcoin Gospel. Retrieved 2017, from https://topdocumentaryfilms.com/bitcoin-gospel/

UNETHICAL RECRUITER PIMP INDULGES JOBSEEKERS WEAKNESS AND WISHES

FADI ABU ZUHRI

INTRODUCTION

Having a decent job that meets a person's needs and expectations is a vital necessity for any adult. According to Calenda (2016), the need sometimes becomes crucial for individuals who have just completed their education or are in dire need of employment. This high demand can lead to desperation, which makes job seekers vulnerable and at the mercy of their potential employers. The issue is so common that agencies have been created by law to handle such cases. It is, therefore, important to understand some of these unethical issues and how to deal with them.

The recruitment process can go wrong in various ways (Shaffer, Bakhshi, & Kim, 2015). If recruiters engage in any of the following behaviours, the behaviours should be considered unethical and be reported, to ensure that they are not repeated and to promote a high level of professionalism.

PHISHING FOR PERSONAL INFORMATION

Fake job interviews are increasingly common. They are one way for fraudsters to fish for personal information and money. Scammers use sophisticated tools and techniques to obtain sensitive personal financial information from prospective candidates. A stolen identity is quite valuable, as it can be sold over the “dark net” for money (Williams & Pellecchia, 2017).

There are also instances of people roping in the expertise of job seekers for free, under the illusion that they are being interviewed for a fancy job. Job seekers therefore need to be vigilant and ask relevant questions about the hiring process. They need to beware of undue appreciation and of people who do not paint a consistent picture of the job profile (Ryan, 2017).

USING EXPLODING JOB OFFERS

Exploding job offers are job offers with a short expiry date. The regular period between the announcement and the date of recruitment is not less than two weeks; anything less than this is considered an exploding job offer. Exploding job offers are discouraged since they put pressure on the candidate to beat the deadline and make the necessary arrangements for the job (Shaffer, Bakhshi, Dutka, & Phillips, 2016). Candidates need ample time before any interview to gather all the materials required for recruitment, along with any materials they think will help them secure the job, and to prepare themselves psychologically. It is considered unethical to inform someone of a job opportunity less than two weeks before the day of the interview; this practically prepares them to fail.

TYING SIGNING BONUSES TO EXPLODING JOB OFFERS

Tying signing bonuses to exploding job offers comes with a lot of controversy, given that these gifts come with certain terms and conditions (Calenda, 2016). It is, therefore, prudent to understand those conditions before signing for the bonus, to avoid getting caught in a compromising situation when the time comes to leave the company.

USING HIGH-PRESSURE INTERVIEW TACTICS

Some approaches during interviews tend to scare away candidates and lower their confidence in front of the panel. Interviews are meant to gauge candidates' knowledge and ability to carry out the tasks required of them (de Silva, Opatha, & Gamage, 2016). There are incidents where recruiters use techniques meant to scare candidates away from the interview, or to reduce their numbers so that the panel deals with only a few candidates. Such scare tactics are discouraged because they deny candidates the chance to express themselves in a comfortable environment. One example is asking the candidate irrelevant questions, such as “What is your worst experience?” Such questions belong in an informal context; asking them in an interview leaves the candidate confused about how to respond, since they find it awkward.

REVOKING A JOB OFFER TO A CANDIDATE

Revoking a job offer to a candidate is something that should be strongly avoided. Job offers are open to all candidates provided they meet the requirements of the job as indicated in the announcement. There are cases where recruiters simply decide to revoke offers to some candidates without any solid grounds. This is considered unethical because if individuals have met all the requirements they were expected to attain, they have every right to claim the job.

WITHHOLDING RELEVANT INFORMATION IN EXTENDING A JOB OFFER

When announcing a job, it is important for recruiters to disclose all information about the position being offered. Information such as salary, relocation allowance, starting date and job title is vital and should be included (Jeske & Shultz, 2016). To understand the importance of including this information, consider salary. What motivates people to go for a particular job is the salary, which usually goes hand in hand with experience and the level of knowledge and education. Including information about the salary minimises cases of underpayment.

EXHIBITING UNPROFESSIONAL BEHAVIOR

The recruitment of employees should be conducted by any organisation or company with maximum professionalism, just like any other formal activity of the organisation. Any personal behaviour that depicts unprofessional conduct in a recruitment process is highly discouraged. There are various forms of unethical conduct, including fraternising and harassment. These two acts may be different in this context, but neither should be entertained.

Fraternising involves associating with candidates in a personal and friendly manner. The recruiting panel is not expected to treat candidates in so friendly a manner as to suggest any form of favour. Fraternising might compromise the whole recruitment process, leading to the recruitment of incompetent employees. Another form of unprofessional conduct during recruitment is harassment (Shaffer et al., 2016). Harassment of candidates includes all its forms: sexual, personal, racial, age-related and even bullying. None of these acts provide a comfortable environment for candidates, and all are highly discouraged.

CONCLUSION

An unethical recruiter is a pimp who advertises and indulges jobseekers' weaknesses and/or wishes, to gain something at the expense of the job seeker. Although certain of these activities are not criminalised in the real world, the practice comes close to human slavery, as the jobseeker is victimised by being coerced into taking up a job offer that is not worth it.

It is unjust and unethical for recruiters to take advantage of those who step to their doorstep in search of a livelihood (Wong & Li, 2015). Such conduct undermines the professionalism of the recruiting company and may lead to the employment of incompetent people. It is considered misconduct, and victims should report these cases to ensure that the perpetrators face the law.

REFERENCES

  1. Calenda, D. (2016). ‘Sustainable recruitment’ of foreign-educated nurses: ethical and work related issues. The case of Finland. Robert Schuman Centre for Advanced Studies Research Paper.
  2. de Silva, V. A., Opatha, H. H., & Gamage, A. S. (2016). Towards Extending the Ethical Dimension of Human Resource Management. International Business Research , 9 (10), 151.
  3. Jeske, D., & Shultz, K. S. (2016). Using social media content for screening in recruitment and selection: pros and cons. Work, Employment & Society , 30 (3), 535-546.
  4. Ryan, L. (2017, August 8). Ten Signs You’re Interviewing For A Fake Job Opportunity. Retrieved 2017, from https://www.forbes.com/sites/lizryan/2017/08/08/ten-signs-youre-interviewing-for-a-fake-job-opportunity/#43cbd2987aaa
  5. Shaffer, F. A., Bakhshi, M., & Kim, E. M. (2015). Business Case for Ethical Recruitment. Nurse Leader , 13 (5), 40-48.
  6. Shaffer, F. A., Bakhshi, M., Dutka, J. T., & Phillips, J. (2016). Code for ethical international recruitment practices: the CGFNS Alliance case study. Human Resources for Health , 14 (31).
  7. Williams , A., & Pellecchia, R. (2017, July 6). Fake Online Job Interviews Phishing for Your Personal Information. Retrieved 2017, from Financial Industry Regulatory Authority: http://www.finra.org/newsroom/2017/fake-online-job-interviews-phishing-your-personal-information
  8. Wong, S. C., & Li, J. S. (2015). Will hotel employees’ perception of unethical managerial behavior affect their job satisfaction? A study of Chinese hotel employees in China. International Journal of Contemporary Hospitality Management , 27 (5), 853-877.

CYBER BODY LANGUAGE

FADI ABU ZUHRI

INTRODUCTION

For several hundred years, official agencies have been studying techniques and mechanisms to identify individuals. They started off with passports and identity cards and later developed more controversial schemes like DNA profiling and body surveillance (Caplan & Torpey, 2001).

It is estimated that there are 39 million web servers worldwide hosting 3 billion indexable web pages with 20 billion links, and ever-increasing surveillance by governments as well as telecom operators comes at the cost of netizens' privacy (Batty, 2003). Technological advances in identity and behaviour mapping have become more daring in recent times. Handheld mobile phones and other gadgets have made it possible for businesses to learn about people's behaviour and to gather vital information that helps them reach out to these users. Phone manufacturers, software developers and Internet search engines can now detect users' behaviour and interests through integrated algorithms and computing devices.

Cyber Body Language is best understood as “Context-Awareness” where a device or software is designed, primarily or partly, to analyse the behaviour or pattern of the users and apply information gathered to automatically assert products, services, or other purposes such as security monitoring.

This article covers the implications of Cyber Body Language’s Context-Awareness and how it will affect the users in terms of privacy, finances and consumption. The review of related literature discusses Cyber Body Language, Context-Awareness, Context-Awareness Computing, Privacy, Geolocations and Targeted Ads through personalized hypermedia application.

CYBER BODY LANGUAGE

According to Oracle (2014), Cyber Body Language or “Digital Body Language” is similar to the facial expressions or behaviour a user displays when interacting in the cyber world. In the online equivalent, these behaviours and expressions could be web browsing history, download history, web searches and online communication. This behaviour is the raw data that provides information about the user's interests, needs and so on. Even the schedule of the user's online presence can be useful information for organizations monitoring the user's behaviour (Oracle, 2014).

The transformative shift of physical activities, such as shopping, to online transactions has created the marketing challenge of comprehending online consumer behaviour (Woods, 2009). Oracle (2014) stated that marketing and sales operations need to be adapted to ensure that they are Context-Aware, or able to comprehend the Cyber Body Language of consumers. The organization must first have a broad understanding of the impact of the shift and how all its processes change with it, and must be well-equipped with the technology and infrastructure needed to synthesize information based on consumer behaviour (Oracle, 2014).

CONTEXT-AWARENESS

Dey (2001) defined context as any data that can be utilized to describe the situation of an entity, where an entity can be the user, a location or a thing that is significant in the domain of the application or software (Dey, 2001). Context-Awareness, in turn, describes a system's use of that information: a system is said to be Context-Aware when it has the ability to gather and synthesize context information and apply it to improve and adapt the device (Byun & Cheverst, 2004).

Context-Awareness aims to provide efficiency and usability in the service offered to users, which is only possible by being flexible and aware of users' changing behaviours (Bolchini, Schreiber, & Tanca, 2007; Dey, 2001; Zhu, Mutka, & Ni, 2005). Context plays a crucial role because it is built up from user information and includes data on status, location and interests (Korpipää, Mäntyjärvi, Kela, Keränen, & Malm, 2003; Kwon, 2004).

CONTEXT-AWARENESS COMPUTING

In understanding Cyber Body Language, Context-Aware Systems were developed that take advantage of user behaviour. Context-Aware Systems gather context, analyse it, and then use the information acquired to customize the system based on the behaviour or changing situation of the user (Khattak et al., 2014).
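The gather-analyse-adapt loop can be sketched as follows. This is a toy illustration with invented rule names and settings, not any particular product's API:

```python
from dataclasses import dataclass

@dataclass
class Context:
    # Context per Dey (2001): any data characterising the user's situation.
    location: str
    hour: int
    battery_pct: int

def adapt(ctx: Context) -> dict:
    # Analyse the gathered context and customise system behaviour
    # without any active participation from the user.
    settings = {"theme": "light", "notifications": "all", "sync": "on"}
    if ctx.hour >= 22 or ctx.hour < 7:
        settings["theme"] = "dark"
        settings["notifications"] = "silent"   # assume the user is resting
    if ctx.battery_pct < 15:
        settings["sync"] = "off"               # conserve power
    if ctx.location == "office":
        settings["notifications"] = "work-only"
    return settings

print(adapt(Context(location="home", hour=23, battery_pct=10)))
```

Real Context-Aware Systems replace these hand-written rules with learned models, but the structure is the same: sensed context in, adapted behaviour out.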

Facebook plans to figure out the emotional state of its users. It has filed a flurry of patents that try to detect our emotions. One of the patents, Augmenting Text Messages with Emotion Information, involves decorating text messages to fit people's moods; Facebook intends to join features with words to show the impressions of the sender (Vaas, 2017).

Another proposed emotion-reading patent is Techniques for Emotion Detection and Content Delivery. It plans to use the cameras on our phones, tablets and laptops to observe us as we peer at the screens. Yet another emotion-gleaning technology has been described that generates emojis based on the user's facial expression. Tools of this type can be used by marketers to gauge consumers' reactions and cater to them (Vaas, 2017).

In short, Context-Aware Systems are made to adapt in accordance with the context of the user, without the user's active participation in such changes (Khattak et al., 2014). These systems synthesize the behaviour and environment of the user with the aim of remaining usable and effective over time (Baldauf, Dustdar, & Rosenberg, 2007; Khattak et al., 2011; Chen, Nugent, & Wang, 2012).

Context-Aware Systems are becoming more popular and have been developed for diverse domains and interfaces such as Location-Based Systems (Want, Hopper, Falcão, & Gibbons, 1992), Context-Aware file systems (Hess & Campbell, 2003), Context-Aware Security (Covington, Fogla, Zhiyuan, & Ahamad, 2002), Context-Aware Activity Recognition (Khattak, et al., 2011), Context-Based Searching (Ding, et al., 2004; Khattak, Ahmad, Mustafa, Pervez, Latif, & Lee, 2013), and Intelligent Healthcare Systems (Khattak, Ahmad, Mustafa, Pervez, Latif, & Lee, 2013; Khattak, Pervez, Lee, & Lee, 2011; Hussain, et al., 2013; Khattak, Pervez, Han, Lee, & Nugent, 2012). Nowadays, the use of Context-Aware Systems has become commonplace and part of everyday life for users of the cyber world. In fact, Cyber Behaviour sensing and computing capabilities are already installed in most smart devices (Khattak, Ahmad, Mustafa, Pervez, Latif, & Lee, 2013; Khattak, Pervez, Lee, & Lee, 2011; Han, Vinh, Lee, & Lee, 2012).

The context gathered from users is classified as internal or external (Hofer, Schwinger, Pichler, Leonhartsberger, & Altmann, 2013), but the quality of the information derived by Context-Aware Systems does not depend on whether it is internal or external. Such systems are designed to acquire and synthesize context in order to make it useful and effective for further processing (Baldauf, Dustdar, & Rosenberg, 2007; Han, Vinh, Lee, & Lee, 2012).

Another domain of Context-Awareness is the personalised hypermedia application. This is a hypermedia system which, like any Context-Aware System, applies the information, structure and physical attributes of networked hypermedia objects to the user's environment, characterization and behaviour. This Context-Aware domain is an interactive system: users can navigate a network of linked hypermedia objects. Examples of hypermedia are web pages, which contain various media types like text, photos, videos, clips, applications and other similar elements (Kobsa, Koenemann, & Pohl, 2001).

PRIVACY

User behaviour on the Internet has become subject to breaches of privacy and security. Smith et al. (1996) enumerated four instances where privacy concerns arise, to wit: the gathering of personal information, unapproved indirect use of personal information, supplying of wrong personal information, and unauthorized access to personal data (Stewart & Segars, 2002). These concerns apply in online marketing in the same regard: the collection of personal information, the storage and control of that information, the observance of privacy practices, and the use of the data in a way that promotes marketing without breaching the sensitive line of privacy (Malhotra, Kim, & Agarwal, 2004). Most consumers are concerned about the unapproved indirect use of data and the supplying of wrong personal information (Brown & Muchira, 2004). The consumer may also lose trust in a vendor when the latter insists on collecting information that evokes privacy concerns (Camp, 2003).

Google and Microsoft argue that they have the right to scan all emails passing through their systems. This means that Google can read keywords that trigger relevant advertisements (Schofield, 2013). Facebook has a privacy setting that allows users to stop the collection of behavioural information, but by default it is set to allow the collection of private information. Even opting out does not stop advertisements on Facebook (Smith, 2016).

There are various ways in which privacy can be breached through the utilization of Cyber Body Language. Context-Aware Systems are made smart and adaptable; users are often caught off guard while their behavioural patterns are already being studied in the furtherance of the systems they use. Most of the time these Context-Aware devices are useful, but unauthorized access to, or misuse of, the data gathered from the user might pose a security threat. Although Context-Aware Systems can be very damaging to the privacy of the user, it should also be considered that they can provide security: a Context-Aware System can intelligently analyse the behaviour of the user, assess a possible breach of security and synthesize that information to strengthen security systems.

According to Milne and Gordon (1993), the collection of such personal information calls for proper treatment, as it is considered an “Implied Social Contract” with the consumer. The consumer has a right to sue and to be compensated if there is an instance where his trust has been breached by the vendor (Solove, 2006). Because of this, the vendor is always required to observe fair information practices to guarantee the consumer that his personal information is well respected and well preserved (Culnan, 2000; Dinev & Hart, 2006).

GEOLOCATIONS

One of the most popular domains of Context-Aware Services is location-based services. These are usually mobile services that follow the location of their users (Rao & Minakakis, 2003), which is basically the primary market of Context-Awareness. One widely used location-based application is Geo-Fencing and its allied services, such as a notification signal that alerts the user on entering a certain area, like the vicinity of a police station or school grounds (Namiot, 2013).
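At its simplest, a geo-fence of the kind described is a distance check between the user's coordinates and a point of interest. The sketch below uses the haversine formula for great-circle distance; the fence centre is a hypothetical example, and production services typically use polygonal fences and the platform's native location APIs rather than a hand-rolled check:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points in kilometres.
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_fence(user, centre, radius_km):
    # Fire the notification when the user enters the circular fence.
    return haversine_km(*user, *centre) <= radius_km

school = (51.5007, -0.1246)  # hypothetical fence centre
print(inside_fence((51.5010, -0.1240), school, 0.5))  # a few dozen metres away -> True
```

The service simply polls the device's location and triggers the notification whenever `inside_fence` flips from false to true.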

According to Rivero-Rodriguez et al. (2016), issues can arise from the inability to secure location privacy in a Location-Based Context-Aware environment. One issue with a location-aware device is spamming, where the user is barraged with advertisements for products or services. A second is the threat to the personal safety of users, who can easily be targeted for harassment, assault or other crime because their location is easily traced. A third is the ability of other users to access a user's spatio-temporal information, which reveals private and personal information, including religious and political views (Rivero-Rodriguez, Pileggi, & Nykänen, 2016).

TARGETED ADS

Advertisements are targeted at users who meet certain behavioural characteristics. An example is the tool created at Cambridge University called “Apply Magic Sauce”, which is said to predict the psycho-demographic profile of a user based on the footprints left on social media like Twitter and Facebook. It was developed to give specific insight into the behaviour, personality, attitude and interests of the user (Psychometric Centre of University of Cambridge, 2017).

Another tool, “Crystal”, predicts a user’s profile by analysing his email history and LinkedIn profile. It can also be run against the user’s email contacts, giving the user a sense of each contact’s behaviour or character. The main objective of the tool is to help the user become a better communicator (Crystal Project Inc., 2017).

CONCLUSION

Cyber Body Language is a result of the evolution of computing systems, in which users’ patterns and behaviours are studied and become the trigger points for the enhancement, upgrade or replacement of installed systems. This adaptive ability of devices to read Cyber Body Language has developed so well that it has become a source of concern for everyone, since most users have already experienced how it can be used to exploit, harass, bombard or sneak into their personal space, putting security and privacy at great risk. At the same time, it cannot be discounted that Cyber Body Language is a mine of discoveries that can help technologies advance continuously without explicit participation from users. Keeping in mind the age-old respect for one’s privacy and personal space, it is only logical to suggest that the exploitation of Cyber Body Language should be regulated.

REFERENCES

  1. Baldauf, M., Dustdar, S., & Rosenberg, F. (2007). A Survey on context-aware systems. Int. J. Ad Hoc Ubiquitous Comput , 263-277.
  2. Batty, M. (2003). The Next Big Thing: Surveillance from the Ground up. Environment and Planning B: Urban Analytics and City Science , 30 (3).
  3. Bolchini, C., Schreiber, F. A., & Tanca, L. (2007). A methodology for a very small database designs. Information Systems , 61-82.
  4. Brown, M., & Muchira, R. (2004). Investigating the Relationship between Internet Privacy Concerns and Online Purchase Behavior. Journal of Electronic Commerce Research , 62-70.
  5. Camp, L. J. (2003). Design for trust. In R. Falcone, Trust, Reputation and Security: Theories and Practice. Springer-Verlag.
  6. Caplan, J., & Torpey, J. (2001). Documenting Individual Identity: The Development of State Practices in the Modern World. Princeton, NJ: Princeton University Press.
  7. Chen, L., Nugent, C., & Wang, H. (2012). A knowledge-driven approach to activity recognition in smart homes. IEEE Transactions on Knowledge and Data Engineering , 961–974.
  8. Covington, M., Fogla, P., Zhiyuan, Z., & Ahamad, M. (2002). A context-aware security architecture for emerging applications. 18th Annual Computer Security Applications Conference, (pp. 249-258). Las Vegas, NV.
  9. Crystal Project Inc. (2017). Crystal. Retrieved May 15, 2017 from Crystal Knows: https://www.crystalknows.com/
  10. Culnan, M. J. (2000). Protecting Privacy Online: Is Self-Regulation Working? Journal of Public Policy and Marketing , 20-26.
  11. Dey, A. K. (2001). Understanding and using context. Personal and Ubiquitous Computing , 4-7.
  12. Dinev, T., & Hart, P. (2006). An Extended Privacy Calculus Model for E-Commerce Transactions. Information Systems Research , 61-80.
  13. Ding, L., Finin, T., Joshi, A., Pan, R., Scott Cost, R., Peng, Y., et al. (2004). Swoogle: A search and metadata engine for the semantic web. 13th ACM International Conference on Information and Knowledge Management, (pp. 8-13). Washington, DC.
  14. Han, M., Vinh, L., Lee, Y., & Lee, S. (2012). Comprehensive context recognizer based on multimodal sensors in a smartphone. Sensors , 12588–12605.
  15. Hess, C., & Campbell, R. (2003). An application of a context-aware file system. Personal and Ubiquitous Computing , 339–352.
  16. Hofer, T., Schwinger, W., Pichler, M., Leonhartsberger, G., & Altmann, J. (2013). Context-awareness on mobile devices-The hydrogen approach. 36th Annual Hawaii International Conference on System Sciences, (pp. 6-9). Big Island, HI, USA.
  17. Hussain, M., Khattak, A., Khan, W., Fatima, I., Amin, M., Pervez, Z., et al. (2013). Cloud-based Smart CDSS for chronic diseases. Health Technology , 153-175.
  18. Khattak, A. M., Akbar, N., Aazam, M., Ali, T., Khan, A. M., Jeon, S., et al. (2014). Context Representation and Fusion: Advancements and Opportunities. Sensors , 9628–9668.
  19. Khattak, A., Ahmad, N., Mustafa, J., Pervez, Z., Latif, K., & Lee, S. (2013). Context-aware Search in Dynamic Repositories of Digital Documents. 16th IEEE International Conference on Computational Science and Engineering (CSE 2013), (pp. 3-5). Sydney, Australia.
  20. Khattak, A., Pervez, Z., Han, M., Lee, S., & Nugent, C. (2012). DDSS: Dynamic decision support system for elderly. 25th IEEE International Symposium on Computer-Based Medical Systems (CBMS 2012), (pp. 20-22). Rome, Italy.
  21. Khattak, A., Pervez, Z., Lee, S., & Lee, Y. (2011). Intelligent healthcare service provisioning using ontology with low-level sensory data. KSII Transactions on Internet and Information Systems , 2016–2034.
  22. Khattak, A., Truc, P., Hung, L., Vinh, L., Dang, V., Guan, D., et al. (2011). Towards smart homes using low level sensory data. Sensors , 11581–11604.
  23. Kobsa, A., Koenemann, J., & Pohl, W. (2001). Personalised hypermedia presentation techniques for improving online customer relationships. The Knowledge Engineering Review , 111-155.
  24. Korpipää, P., Mäntyjärvi, J., Kela, J., Keränen, H., & Malm, E. J. (2003). Managing context information in mobile devices. IEEE Pervasive Computing , 42-51.
  25. Kwon, O. B. (2004). Modeling and generating context-aware agent-based applications with amended colored petri nets. Expert Systems with Applications , 609-621.
  26. Malhotra, N., Kim, S. S., & Agarwal, J. (2004). Internet Users’ Information Privacy Concerns (IUIPC): The Construct, the Scale, and a Causal Model. Information Systems Research , 336-355.
  27. Milne, G. R., & Gordon, M. E. (1993). Direct mail privacy-efficiency trade-offs within an implied social contract framework. Journal of Public Policy Marketing , 206–215.
  28. Namiot, D. (2013). GeoFence Services. International Journal of Open Information Technologies , 30-33.
  29. Oracle. (2014). Digital Body Language: Reading and Responding to Online Digital Body Behaviors. Digital Body Language Guide .
  30. Psychometric Centre of University of Cambridge. (2017). Facebook and Twitter Prediction. Retrieved May 15, 2017 from Psychometric Centre of University of Cambridge: https://applymagicsauce.com/demo.html
  31. Rao, B., & Minakakis, L. (2003). Evolution of Mobile Location-based Services. Commun. ACM , 61-65.
  32. Rivero-Rodriguez, A., Pileggi, P., & Nykänen, O. A. (2016). Mobile Context-Aware Systems: Technologies Resources and Applications. International Journal of Interactive Mobile Technologies , 25-32.
  33. Schofield, J. (2013, August 15). Is Gmail secure enough for my private emails? Retrieved 2017 from https://www.theguardian.com/technology/askjack/2013/aug/15/gmail-google-email-privacy
  34. Smith, H. J., Milberg, S., & Burke, S. (1996). Information privacy: Measuring individuals’ concerns about organizational practices. MIS Quarterly , 167-196.
  35. Smith, L. (2016, June 3). You Need to Update Your Facebook Privacy Settings — Again. Retrieved 2017 from http://www.goodhousekeeping.com/life/news/a38801/targeted-facebook-ads-privacy-settings/
  36. Solove, D. J. (2006). A Taxonomy of Privacy. University of Pennsylvania Law Review , 477.
  37. Stewart, K. A., & Segars, A. H. (2002). An empirical examination of the concern for information privacy instrument. Information Systems Research , 36-49.
  38. Vaas, L. (2017, June 12). Facebook wants to feel your pain (and your joy). Retrieved 2017 from https://nakedsecurity.sophos.com/2017/06/12/facebook-wants-to-feel-your-pain-and-your-joy/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+nakedsecurity+%28Naked+Security+-+Sophos%29
  39. Want, R., Hopper, A., Falcão, V., & Gibbons, J. (1992). The active badge location system. ACM Transactions on Information Systems , 91-102.
  40. Woods, S. (2009). Digital Body Language: Deciphering Customer Intentions in an Online World. Danville, CA: New Year Publishing.
  41. Zhu, F., Mutka, M. W., & Ni, L. M. (2005). Service discovery in pervasive computing environments. IEEE Pervasive Computing , 81-90.

 

CHALLENGES FACED BY CYBER FORENSIC INVESTIGATOR – CONCEPTS AND TECHNIQUES

FADI ABU ZUHRI

INTRODUCTION

This paper looks at the techniques and tools used by Cyber Forensic Investigators in scenarios that prove challenging. Cyber Forensic Investigators are tasked with presenting digital evidence to the courts, and courts accept only evidence based on reliable principles and methods. One therefore needs a way to distinguish reliable techniques from unreliable ones. For example, the courts consider evidence from astronomy reliable while evidence from astrology is not, even though both draw on the same raw materials – star charts, planetary positions, telescopes, and so on. Cyber Forensic techniques and tools likewise need to be evaluated for reliability before their results are presented to the courts.

LIVE FORENSICS

Live forensics is mostly applied when the item under investigation is too large to be captured practically by imaging (Karie & Venter, 2015). There are also situations where the system to be investigated is too big to be taken apart for post-mortem analysis, or where the computer is located far away from the Cyber Forensic Investigator. Each of these situations calls for live forensics. However, this does not mean that one must download everything from the remote location, since doing so would require a sophisticated network connection (Brown, 2006). Additionally, some artefacts cannot be captured for post-mortem analysis at all – for example the memory contents, open ports and other operating aspects of a running computer. In such cases it is advisable to use court-tested methods, to avoid being required to prove the validity of the method in question. According to Stephenson, the most common situation requiring live forensics is digital forensic incident response, where it is used to understand what is in memory, what the computer is communicating out, and what processes and ports are running.

Organizations have been migrating their data to cloud storage at a high rate, and many technology decision makers have moved their businesses to cloud services. Based on the experience of these organizations, there are three main challenges to overcome in performing sound data collection in the cloud. Firstly, it is easy to get data into the cloud but hard to get it back out. Secondly, data protection laws differ between countries. Thirdly, the native tools of Office 365, which is seeing growing adoption among organizations, are inadequate for large-scale collection, creating a great challenge for data collection (Barocchini & Maccherola, 2017).

DATA RECOVERY

Reliable methods of data recovery are critical for any Forensic Investigator, as losing data is sometimes inevitable during criminal investigations (Rogers & Seigfried, 2014). For a Cyber Forensic Investigator information is key, so it is highly recommended that measures be put in place to ensure information can be recovered once lost. For example, when one loses a file of which there is no extra copy, it is relatively easy to recover the file if it is recent and has not been overwritten. The method to deploy depends on whether one wants to examine the data in depth or simply obtain a copy of the file. For in-depth examination, it is possible to recover the file by bookmarking it and analysing it bit by bit, just as in document forensics (Karie & Venter, 2015). For a copy of the file, computer forensics allows one to extract the file from the image as a stand-alone file.

RECOVERING POTENTIALLY OVERWRITTEN FILES

Digital storage is designed in such a way that when one deletes a file, the data stays on the medium, allowing natural restoration of the file. However, mainly as a result of disk fragmentation, that data can be lost: fragmentation can cause the file’s contents to be overwritten. It may still be possible to recover such files using the file table (Samy et al., 2017), which determines how files are physically stored on the medium. If the data has been partially overwritten, it may be recovered by reconstructing the file header. If the file header itself has been overwritten, file carving is used (Rogers & Seigfried, 2014).
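As a minimal illustration of header-based file carving, the sketch below scans a raw byte buffer for JPEG start-of-image and end-of-image markers and extracts whatever lies between them. Real carvers must handle fragmentation, nested markers and many file types; the “disk image” here is a synthetic toy built for the example.

```python
JPEG_SOI = b"\xff\xd8\xff"   # JPEG start-of-image signature
JPEG_EOI = b"\xff\xd9"       # JPEG end-of-image marker

def carve_jpegs(image: bytes):
    """Scan a raw image for JPEG header/footer pairs and return the carved slices."""
    carved, pos = [], 0
    while (start := image.find(JPEG_SOI, pos)) != -1:
        end = image.find(JPEG_EOI, start + len(JPEG_SOI))
        if end == -1:
            break  # header with no footer: file truncated or partially overwritten
        carved.append(image[start:end + len(JPEG_EOI)])
        pos = end + len(JPEG_EOI)
    return carved

# toy "disk image": padding bytes around one embedded JPEG fragment
disk = b"\x00" * 16 + JPEG_SOI + b"fake-jpeg-body" + JPEG_EOI + b"\x00" * 16
print(len(carve_jpegs(disk)))  # -> 1
```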

PASSWORD RECOVERY

Passwords are put in place to ensure data security, but there comes a time when a password itself becomes a threat to data security. For this reason, measures for password recovery should be in place. The process may be easy or hard depending on the type of password being recovered (Bennet, 2012). The easiest route is a dictionary attack: this tool assumes the password is a dictionary word and finds it by trial and error. After a dictionary attack, hash or password replacement is the next step, though this does not apply to all situations, since some systems are more complex. If the dictionary attack is not successful, brute force can be used. This is a widely known but time-consuming recovery process; the time required depends on the number of possible combinations that must be tried before the actual password is found.
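The dictionary attack described above can be sketched with nothing more than a hash function and a wordlist: each candidate word is hashed and compared with the stored digest. The SHA-256 digest and the wordlist below are illustrative stand-ins for whatever hash and candidate list a real recovery would use.

```python
import hashlib

def dictionary_attack(target_hash: str, wordlist):
    """Hash each candidate word and compare it with the stored digest."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None  # wordlist exhausted: fall back to brute force

words = ["letmein", "password", "dragon", "sunshine"]
stored = hashlib.sha256(b"dragon").hexdigest()  # digest recovered from the system
print(dictionary_attack(stored, words))  # -> dragon
```

A brute-force attack differs only in that the candidates are generated exhaustively rather than read from a wordlist, which is why its running time grows with the number of possible combinations.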

FORENSIC IMAGE ANALYSIS

Forensic Image Analysis uses search indexing and file filtering techniques. An index search is used where the data has been grouped into categories using an index; digital devices store data using indexes to help people retrieve it. The file filtering tool, on the other hand, uses hashes to identify the relevant files (Karie & Venter, 2015).

The general idea of forensic image analysis lies in the tools used for this challenge. The most used is the search tool, which comes in two forms. The index search is the simplest, involving a search of a database: when an application processes a disk for image analysis, it creates an index table in a back-end database, and the image is then searched with the aid of that index. The second technique is file filtering. The file filtering tool uses hashes to identify the relevant files, eliminating undesirable items and selecting those the forensic investigator is looking for (Simon & Choo, 2014).
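Hash-based file filtering can be sketched as follows: every file whose digest appears in a known-file hash set (for example a reference set of stock operating-system files) is eliminated, leaving only the items the investigator still needs to review. The file contents and hash set below are invented for illustration.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Digest used as the file's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def filter_known_files(files: dict, known_hashes: set) -> dict:
    """Drop files whose digest matches the known-good set; keep the rest for review."""
    return {name: data for name, data in files.items()
            if sha256_of(data) not in known_hashes}

known = {sha256_of(b"standard OS binary")}  # reference set of uninteresting files
evidence = {
    "notepad.exe": b"standard OS binary",   # stock file: filtered out
    "plans.txt":   b"meet at the docks",    # unknown file: kept for review
}
print(list(filter_known_files(evidence, known)))  # -> ['plans.txt']
```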

CRYPTANALYSIS AND STEGANALYSIS

Steganalysis is the process of finding hidden data within digital objects; it is the counterpart, for steganography, of cryptanalysis applied to cryptography. Information can be hidden in messages, in images, or in a file concealed within another file (Otair, 2015).

Encryption has always been a major obstacle for Cyber Forensic Investigators, since it is very hard to break and not all encryption is the same. Encryption is usually performed by an application, which most of the time leaves traces of plaintext behind. These plaintext traces are hard to find, yet they provide everything necessary to break the encryption. The first step is to identify which application was used. Some applications are good at deleting all traces of plaintext, but it may still be possible to break the encryption if the plaintext was saved elsewhere or in another version. The next step is to identify a weakness of the application used for encryption and exploit it; one can then finally access the file, provided the file name is known (Quick & Choo, 2016).
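One simple heuristic for telling candidate ciphertext apart from plaintext trails is a Shannon-entropy test, since well-encrypted data looks statistically uniform. This is a sketch of a common triage trick rather than a method named by the sources, and it has a known blind spot: compressed files also score near 8 bits per byte.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte: English text sits well below 8, ciphertext near 8."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

plaintext = b"the quick brown fox jumps over the lazy dog " * 20
random_like = bytes(range(256)) * 4  # stand-in for ciphertext: uniform byte values

print(round(shannon_entropy(plaintext), 2))    # low: repetitive text
print(round(shannon_entropy(random_like), 2))  # 8.0: maximal entropy
```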

FORENSIC NETWORK ANALYSIS

Sniffing is the process of analysing all the data that passes through a given network. Sniffers range from open-source tools to commercial and more sophisticated ones (Dykstra & Sherman, 2013). For a sniffer to work on a particular network, the Network Interface Card (NIC) must be configured in promiscuous mode, allowing it to receive network traffic even when it is not addressed to that NIC (Gordon, 2016).
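Opening a capture socket in promiscuous mode requires raw-socket access and elevated privileges, so the sketch below leaves capture aside and shows the dissection step a sniffer performs on every frame it receives: unpacking the 14-byte Ethernet header into destination, source and EtherType. The frame bytes here are synthetic.

```python
import struct

def parse_ethernet(frame: bytes):
    """Dissect the 14-byte Ethernet header a sniffer hands to the analyst."""
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    mac = lambda b: ":".join(f"{x:02x}" for x in b)
    return {"dst": mac(dst), "src": mac(src),
            "type": hex(ethertype), "payload": frame[14:]}

# one captured frame (synthetic): broadcast destination, EtherType 0x0800 (IPv4)
frame = bytes.fromhex("ffffffffffff" "0a1b2c3d4e5f" "0800") + b"ip-payload"
info = parse_ethernet(frame)
print(info["src"], info["type"])  # -> 0a:1b:2c:3d:4e:5f 0x800
```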

BIG DATA

The challenge of big data is to isolate useful data from the vast amounts available. In forensics, big data is randomly distributed, whereas simple data is stratified and can be analysed with simple data-mining methods. After separation of the data, cluster analysis follows: using given criteria, the data are grouped in an orderly manner according to their attributes (Rogers & Seigfried, 2014).

The criteria used in the grouping are chosen by the Cyber Forensic Investigator. Another vital method is anomaly detection, which looks at the data from a perspective different from the investigator’s own. The last approach is dependency analysis, which uses rules to find the relationships in the data that interest the Cyber Forensic Investigator (Gordon, 2016).
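The grouping step can be sketched as a single pass that buckets records by whatever criterion the investigator chooses; real cluster analysis adds similarity measures and iteration, and the log records below are hypothetical.

```python
from collections import defaultdict

def cluster_by(records, key):
    """Group records according to a chosen attribute (the investigator's criterion)."""
    clusters = defaultdict(list)
    for rec in records:
        clusters[key(rec)].append(rec)
    return dict(clusters)

# hypothetical log entries pulled from a large evidence set
events = [
    {"host": "pc-01", "action": "login"},
    {"host": "pc-02", "action": "copy"},
    {"host": "pc-01", "action": "copy"},
]
by_host = cluster_by(events, key=lambda e: e["host"])
print(sorted(by_host))        # -> ['pc-01', 'pc-02']
print(len(by_host["pc-01"]))  # -> 2
```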

SAFE ANALYSIS OF MALWARE

Cyber Forensic Investigators need to identify and, if possible, eliminate all dangers posed by malware before analysing digital evidence. The most common method for this is sandboxing, which involves creating a virtual machine on the physical computer that operates as a separate entity (Rogers & Seigfried, 2014).

With this approach, one can undertake high-risk activities inside the virtual machine and thereby contain the malware that would otherwise threaten the Cyber Forensic Investigator’s work. According to Samy et al. (2017), sandboxing tools can also encapsulate a computer’s web browsing, providing protection from drive-by malware.

DATA VISUALIZATION

A common tool for data visualization in Cyber Forensics is link analysis. It uses graphs, pie charts and crosstabs, among others, to create a visual impression. This is a practical approach in forensic analysis, being both interactive and literally visual (Bennet, 2012).

Ruan et al. (2011) indicate that data visualization depends entirely on the visualization tools the Cyber Forensic Investigator possesses, and there are many open-source and commercial visualization tools on the market. The basic idea of data visualization is to help people understand data by seeing it (Ruan, Carthy, Kechadi, & Crosbie, 2011).
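A bare-bones form of link analysis is to count how many links touch each entity, so that hubs in a set of (here hypothetical) communication records surface immediately; real link-analysis tools add edge weighting, time windows and interactive graph rendering on top of this.

```python
from collections import Counter

def degree_counts(edges):
    """Count how many links touch each entity; highly connected hubs surface first."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg

# hypothetical call records between suspects
calls = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"), ("alice", "dave")]
print(degree_counts(calls).most_common(1))  # -> [('alice', 3)]
```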

CONCLUSION

A national workshop found that the most important challenges in Cyber Forensics were education, training and funding, the size of memory, data volume, and understanding of technology (Baggili & Breitinger, 2017). Cyber forensic investigators are vital in many cases today, given the rapid change in technology over the years. Their knowledge is especially crucial in court, where the use of this technology has made it far easier to reach conclusions in cases that once proved hard to decide.

REFERENCES

  1. Baggili, I., & Breitinger, F. (2017, June 22). NSF National Workshop on Redefining Cyber Forensics. Retrieved 2017, from https://www.youtube.com/watch?v=RBHWVclGmmk&feature=youtu.be
  2. Barocchini, A., & Maccherola, S. (2017, May 31). 3 Challenges to Data Collection in the Cloud. Retrieved 2017, from http://accessdata.com/blog/3-challenges-to-data-collection-in-the-cloud
  3. Bennet, D. W. (2012). The Challenges Facing Computer Forensics Investigators in Obtaining Information from Mobile Devices for Use in Criminal Investigations. Information Security Journal: A Global Perspective , 21 (3), 159-168.
  4. Brown, C. L. (2006). Computer Evidence Collection & Preservation. Massachusetts: Charles River Media, Inc.
  5. Dykstra, J., & Sherman, A. T. (2013). Design and implementation of FROST: Digital forensic tools for the OpenStack cloud computing platform. Digital Investigation , 10, 87-95.
  6. Karie, N. M., & Venter, H. S. (2015). Taxonomy of challenges for digital forensics. Journal of forensic sciences , 60 (4), 885-893.
  7. Quick, D., & Choo, K. (2016). Big forensic data reduction: digital forensic images and electronic evidence. Cluster Computing , 19 (2), 723-740.
  8. Rogers, M. K., & Seigfried, K. (2014). The future of computer forensics: a needs analysis survey. Computers & Security , 23 (1), 12-16.
  9. Ruan, K., Carthy, J., Kechadi, T., & Crosbie, M. (2011). Cloud forensics. IFIP International Conference on Digital Forensics (pp. 35-46). Berlin: Springer.
  10. Samy, G. N., Shanmugam, B., Maarop, N., Magalingam, P., Perumal, S., & Albakri, S. H. (2017). Digital Forensic Challenges in the Cloud Computing Environment. International Conference of Reliable Information and Communication Technology , 669-676.
  11. Simon, M., & Choo, K. (2014). Digital forensics: challenges and future research directions. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2421339. In I.-S. Kim, & J. Liu, Contemporary Trends in Asian Criminal Justice: Paving the Way for the Future (pp. 105-146). Seoul, South Korea: Korean Institute of Criminology.
  12. Stephenson, P. (n.d.). (ISC)² Guide to the CCFP CBK.
