HOW COULD DECEIVING TECHNIQUES BE USED AGAINST CYBER ATTACKS

FADI ABU ZUHRI

INTRODUCTION

Deception refers to actions deliberately performed by a sender to make the receiver believe something the sender knows to be false, to the receiver's disadvantage. In cyber defense, it involves planned actions that present false information to an attacker, steering the attacker into behaviour that strengthens the defense of the computer system (Spafford, 2016). Deceiving techniques are techniques that falsify the perception of reality. They may be deliberate, accidental, or self-induced. Deliberate deception serves as a system defense when it is intended to disadvantage the attacker. In most situations deception combines dissimulation (hiding the real) and simulation (displaying the false). Deceiving techniques used against cyber attacks include masking, repackaging, dazzling, mimicking, inventing, and decoying (Almeshekah, 2015; Spafford, 2016). This article explores how these techniques may be used against cyber attacks.

MASKING

In masking, the real is hidden by keeping the relevant object undetected or by blending it into an irrelevant background. For example, a private message sent to a group email can be hidden by writing it in a white font on a white background, and malicious JavaScript has been embedded as white space inside a benign-looking script (Almeshekah & Spafford, 2014).
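The white-space trick can be sketched concretely. The functions below are a hypothetical illustration, not code from the cited work: a short payload is encoded as a run of spaces and tabs, which renders as blank text inside an otherwise benign-looking document or script.

```python
def mask(payload: str) -> str:
    """Encode a string as a run of spaces (0-bits) and tabs (1-bits)."""
    bits = "".join(f"{byte:08b}" for byte in payload.encode("utf-8"))
    return "".join(" " if bit == "0" else "\t" for bit in bits)

def unmask(blob: str) -> str:
    """Recover the hidden string from the whitespace run."""
    bits = "".join("0" if ch == " " else "1" for ch in blob)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

hidden = mask("alert(1)")
assert set(hidden) <= {" ", "\t"}        # the payload now looks like blank space
assert unmask(hidden) == "alert(1)"      # yet it survives the round trip
```

The same idea underlies white-on-white text: the information is present in the document, but its presentation blends into the background.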

Attackers use masking to hide damaging scripts by giving the text and the background the same colour. Defenders, in turn, can mask the software and services a machine is running, concealing them from an intruder, especially when suspicious activity has been noticed.

REPACKAGING

The repackaging technique hides the real by making an object appear different from what it is. For example, a cyber attack may be repackaged with a friendly, official-looking headline to lure the receiver into opening the message. In other cases, an anonymous remailer is used to replace the sender's real identity and origin information in an email message.

Repackaging may be used both for attack and for defense. Cross-Site Scripting (XSS) uses repackaging when a dangerous post is presented as harmless so as to steal users' cookies when they access it. Cross-Site Request Forgery (XSRF) likewise lures users to attractive web pages that silently make them take part in unwanted actions. Further, some attackers repackage malware as anti-virus software, deceiving users into installing it and thereby giving the attackers control of the users' machines (Almeshekah & Spafford, 2014).

As a defense mechanism, repackaging can be used to create "Honey-files": decoy files that look like normal files but act as alarms, alerting system administrators when an attacker accesses them. Honey-files with enticing names may also be planted on computer systems as beacons that report back when the files are accessed.
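A minimal Honey-file could be sketched as follows, assuming a simple polling check on the file's access time. The file name and contents are invented decoys; real deployments would rely on kernel-level auditing (e.g. inotify or auditd) rather than polling, and access-time recording depends on the filesystem's mount options.

```python
import os
import tempfile

def plant_honeyfile(directory: str, name: str = "passwords.txt") -> str:
    """Create a decoy file with an enticing name and believable fake content."""
    path = os.path.join(directory, name)
    with open(path, "w") as fh:
        fh.write("admin:hunter2\n")   # fake credentials: any read is suspicious
    return path

def was_accessed(path: str, baseline_atime: float) -> bool:
    """True if the decoy's access time has moved past the recorded baseline."""
    return os.stat(path).st_atime > baseline_atime

with tempfile.TemporaryDirectory() as tmp:
    decoy = plant_honeyfile(tmp)
    baseline = os.stat(decoy).st_atime
    # A periodic check would raise the alarm for the administrators:
    if was_accessed(decoy, baseline):
        print(f"ALERT: honey-file {decoy} was read")
```

Because the decoy serves no legitimate purpose, any access to it is a high-confidence signal of intrusion.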

DAZZLING

Dazzling is a technique that induces confusion, for example through obfuscation and randomization of identified elements. It hides the real by making identification of the object less certain, creating confusion about its true nature. An encrypted channel dazzles through obfuscation: it is clear that a message is being sent, yet the message itself is hidden. The Honey-words proposal is a dazzling scheme that hides a real password in a list of several fake passwords, leaving an attacker who steals the password file with many candidates of which only one is true. If the attacker tries any of the fake passwords on the system, an alarm alerts the administrators that the passwords have been stolen (Juels & Rivest, 2013).
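The Honey-words scheme can be sketched roughly as follows. This is an illustrative toy, not Juels and Rivest's implementation: the credential file holds several hashed "sweetwords" per account, while a separate honeychecker (a different host in the real scheme) records only the index of the genuine one. A fast unsalted hash is used here purely for brevity.

```python
import hashlib
import secrets

CREDENTIAL_FILE = {}   # account -> list of hashed sweetwords
HONEYCHECKER = {}      # account -> index of the real password (separate host in the real scheme)

def _digest(word: str) -> str:
    # Illustration only: a real system would use a slow, salted password hash.
    return hashlib.sha256(word.encode("utf-8")).hexdigest()

def register(account: str, password: str, decoys: list) -> None:
    """Store the real password shuffled among decoy honeywords."""
    sweetwords = decoys + [password]
    secrets.SystemRandom().shuffle(sweetwords)
    CREDENTIAL_FILE[account] = [_digest(w) for w in sweetwords]
    HONEYCHECKER[account] = sweetwords.index(password)

def login(account: str, attempt: str) -> str:
    """Return 'ok', 'fail', or 'ALARM' (a honeyword was used: the file leaked)."""
    hashes = CREDENTIAL_FILE[account]
    digest = _digest(attempt)
    if digest not in hashes:
        return "fail"
    if hashes.index(digest) == HONEYCHECKER[account]:
        return "ok"
    return "ALARM"   # a decoy matched: the password list has been stolen

register("alice", "correct-horse", ["p@ssw0rd", "alice123", "letmein1"])
assert login("alice", "correct-horse") == "ok"
assert login("alice", "alice123") == "ALARM"
assert login("alice", "wrong-guess") == "fail"
```

An attacker who cracks the stolen file cannot tell which of the four sweetwords is real; three of the four guesses trip the alarm.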

MIMICKING

Mimicking is a simulation technique that invents the false by imitating relevant traits of a real object. For instance, an attack may use a web page that appears valid and resembles that of a reputable firm, yet is a malicious page set up by an attacker.

As a defense against attackers, mimicking software and services can make one system imitate the responses of another. For example, a system running Windows 7 may respond as though it were Windows XP, so that attackers waste their resources on Windows XP exploits that cannot succeed (Murphy, McDonald, & Mills, 2010).
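At the banner level, this kind of obfuscation can be sketched with hypothetical version strings: the host runs a current service but answers fingerprinting probes as if it were an older one. The banner values below are illustrative examples, not a real fingerprint database.

```python
REAL_BANNER = "Microsoft-IIS/7.5"    # what the server actually runs
DECOY_BANNER = "Microsoft-IIS/5.1"   # what it pretends to run (Windows XP era)

def http_response(path: str, deceive: bool = True) -> str:
    """Build a minimal HTTP response whose Server header mimics an older stack."""
    banner = DECOY_BANNER if deceive else REAL_BANNER
    return (
        "HTTP/1.1 200 OK\r\n"
        f"Server: {banner}\r\n"
        "Content-Length: 0\r\n\r\n"
    )

# An attacker grabbing the banner is steered toward XP-era exploits:
assert "IIS/5.1" in http_response("/")
```

Full OS obfuscation as described by Murphy et al. would also have to disguise TCP/IP stack behaviour, not just application banners; this sketch shows only the simplest layer.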

INVENTING

The inventing simulation technique creates the perception that a relevant object exists when in reality it does not. It has been used in Honeypots, where a Honeypot presents the appearance of a subnet of machines with specific addresses behind which there are no real hosts.

Honeypots are widely used in security applications such as detecting spam and inhibiting spamming operations, analysing malware, and securing databases; today they are also applied in mobile environments. The two major types are client and server Honeypots. Client Honeypots are deliberately vulnerable user agents that actively visit many servers in order to detect ones that compromise them; when a compromise is detected, the client reports the malicious server and the affected users. Server Honeypots, on the other hand, hold no vital information and may be made to appear vulnerable so as to entice attackers into attacking them. In security systems, Honeypots are applied to attack detection, attack prevention, research, and incident response (Almeshekah & Spafford, 2014).
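A server Honeypot can be reduced to a very small sketch: a socket that offers no real service, so that any connection to it is suspicious by definition and gets logged. The port choice and log format here are illustrative; production Honeypots (Dionaea, for example) emulate full protocols.

```python
import socket
import threading
from datetime import datetime, timezone

EVENTS = []  # (timestamp, source-ip) pairs for the administrators

def start_honeypot(host="127.0.0.1", max_conns=1):
    """Listen on an ephemeral port; log and drop every connection."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))
    srv.listen()
    port = srv.getsockname()[1]

    def serve():
        for _ in range(max_conns):
            conn, addr = srv.accept()
            EVENTS.append((datetime.now(timezone.utc).isoformat(), addr[0]))
            conn.close()
        srv.close()

    thread = threading.Thread(target=serve, daemon=True)
    thread.start()
    return port, thread

port, worker = start_honeypot()
socket.create_connection(("127.0.0.1", port)).close()   # simulated probe
worker.join(timeout=2)
print(EVENTS)   # one alert per connection attempt
```

Because no legitimate traffic should ever reach the decoy, its logs are small and almost free of false positives, which is exactly the detection property the literature attributes to Honeypots.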

In attack detection, Honeypots support mechanisms such as intrusion detection systems and are more accurate than traditional mechanisms (Almeshekah & Spafford, 2014). Because Honeypots are not used for daily operations, they generate minimal logging data, and almost any interaction with them is illicit. Shadow Honeypots, for example, have yielded positive results as a detection architecture: anomaly-detection sensors are placed in front of the system and decide the destination of each request, with suspicious traffic diverted to a shadow copy of the real system for further investigation. Honeypots have also been used to detect large-scale attacks on systems.

Studies on the prevention of cyber attacks indicate that Honeypots are useful because they slow attackers down and in some cases hinder their activities altogether. Honeypots occupying dormant IP addresses, for example, slow attackers down by interacting with them. Chen et al. (2008) reported that deception toolkits may help confuse attackers, hinder them from reaching the server, and even push some risk back to the attacker's side. Honeypots use traps and enticements to protect the system, and other studies report their use as a deterrent, hindering attackers from gaining access. Their success has even prompted attackers to develop anti-Honeypot mechanisms.

Honeypots are also effective in incident response. Because a Honeypot is independent of production services, it can be analysed and disconnected after a sustained attack without hindering the production system. In forensic analysis, Honeypots are useful because they preserve the state the attacker left on the system, giving room for an analysis of what happened (Almeshekah & Spafford, 2014).

In research, Honeypots are used to find and analyse new types of malware. Depending on the type of attack observed, security tools can then be developed to improve defenses; for example, Honeypots have been used to derive security signatures. Tools designed to capture the identity of malware include Dionaea, which stores the identity of captured malware. Honeypots also offer a deeper understanding of the most common types of attack.

DECOYING

Decoying is a simulation technique used to attract attention away from the relevant objects. For example, a web page may present false yet believable information about some basic systems so as to draw attention away from the source of the real data. Honeypots can likewise make attackers believe that one system of an organization is vulnerable, capturing the attackers' attention (Carroll & Grosu, 2011).

CONCLUSION

Deceiving techniques are widely used to protect against cyber attacks. A single deception may combine dissimulation and simulation techniques, hiding the real while making sure the attacker sees the false. The attacker's pattern needs to be analysed in order to settle on the specific deceiving technique to use. Applying the techniques discussed in this article offers a system defense against attackers.

While finding evidence is key, doing it legally is equally important. The use of deceiving techniques to catch a criminal may be considered illegal in certain jurisdictions; for example, an intruder could claim that a Honeytrap amounted to entrapment. Privacy issues also need to be considered (Yasinsac & Manzano, 2002).

REFERENCES

1. Al Kawasmi, E., Arnautovic, E., & Svetinovic, D. (2015). Bitcoin-Based Decentralized Carbon Emissions Trading Infrastructure Model. Systems Engineering, 18(2), 115-130.

2. Almeshekah, M. (2015). Using Deception to Enhance Security: A Taxonomy, Model and Novel Uses. PhD thesis, Purdue University.

3. Almeshekah, M., & Spafford, E. (2014). Using Deceptive Information in Computer Security Defenses. International Journal of Cyber Warfare and Terrorism, 4(3), 46-58.

4. Baur, A. W., Bühler, J., Bick, M., & Bonorden, C. S. (2015). Cryptocurrencies as a disruption? Empirical findings on user adoption and future potential of Bitcoin and co. In Conference on e-Business, e-Services and e-Society (pp. 63-80). Springer International Publishing.

5. Burniske, C., & White, A. (2017, January). Bitcoin: Ringing the bell for a new asset class. Retrieved 2017, from Ark Invest: http://research.ark-invest.com/bitcoin-asset-class

6. Carroll, T., & Grosu, D. (2011). A Game Theoretic Investigation of Deception in Network Security. Security and Communication Networks, 4(10), 1162-1172.

7. Chen, X., Andersen, J., Mao, Z., & Bailey, M. (2008). Towards an Understanding of Anti-Virtualization and Anti-Debugging Behavior in Modern Malware. IEEE International Conference on Dependable Systems and Networks (pp. 177-186).

8. Clinton, P. (2014, March). Driving a Drug Dealer's Car. Retrieved 2017, from http://www.government-fleet.com/channel/procurement/article/story/2014/03/driving-a-drug-dealer-s-car.aspx

9. Gao, X., Clark, G. D., & Lindqvist, J. (2016). Of Two Minds, Multiple Addresses, and One Ledger: Characterizing Opinions, Knowledge, and Perceptions of Bitcoin Across Users and Non-Users. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 1656-1668). Santa Clara, California.

10. Juels, A., & Rivest, R. (2013). Honeywords: Making Password-Cracking Detectable. SIGSAC Conference on Computer & Communications Security (pp. 145-160). ACM.

11. Kostakis, V., & Giotitsas, C. (2014). The (A)political economy of Bitcoin. Communication, Capitalism & Critique: Open Access Journal for a Global Sustainable Information Society, 12(2), 431-440.

12. Lopez, G. (2016, September 24). You are way more likely to be killed by deer than by sharks, bears, and gators combined. Retrieved 2017, from https://www.vox.com/2016/9/24/13032272/killer-animals-deer-sharks-bears

13. Muncaster, P. (2017, June 15). World's Largest Bitcoin Exchange Bitfinex Crippled by DDoS. Retrieved 2017, from https://www.infosecurity-magazine.com/news/worlds-largest-bitcoin-exchange/

14. Murphy, S., McDonald, J., & Mills, R. F. (2010). An Application of Deception in Cyberspace: Operating System Obfuscation. 5th International Conference on Information Warfare and Security (pp. 241-249).

15. Rogojanu, A., & Badea, L. (2014). The issue of competing currencies. Case study - Bitcoin. Theoretical and Applied Economics, 21(1), 103-114.

16. Spafford, E. (2016). Some musings on cyber security by a cyber-iconoclast - UNH Alvine Lecture. Retrieved June 11, 2017, from https://m.youtube.com/watch?v=LPBlCJ0zEJc

17. Yasinsac, A., & Manzano, Y. (2002). Honeytraps, a network forensic tool. Sixth Multi-Conference on Systemics, Cybernetics and Informatics. Orlando, Florida, USA.

HOW ONE OF YOUR VIRTUAL PERSONAS COULD BE WORTH 500,000,000.00 EURO

FADI ABU ZUHRI

INTRODUCTION

The advancement of technology has brought about massive change in people's lives and has greatly affected how they transact and behave online. Many activities that were once conducted face-to-face have moved into the virtual world. More and more people have built comprehensive online profiles to shop, bank, and connect with friends, to the point that they have created a Virtual Identity, or Persona, of themselves.

An individual's Virtual Persona allows them to check their credit status and bank balances and to engage in gaming, socializing, dating, blogging, and more. This makes your Virtual Identity of immense value to organizations and people: your online behaviour reveals your buying patterns, and your social and financial status attracts certain people who want to befriend you.

A Virtual Persona has real value, and certain entities may want to access it and impersonate its owner in the virtual world. Data derived from the virtual persona has become a source of profit, both legal and illegal. The widespread, unrestricted and often illegal use of private information necessitates effective online Identity Management to create a safe online environment for e-commerce and Internet usage as a whole (Smedinghoff, 2011).

People need to understand that in the virtual world their online identities have immense value. Earlier, people stored their identity cards in their wallets; now these are stored online, whether in your social, legal or financial profile. This means your Virtual Identity can potentially be stolen electronically. Even something as harmless as online gaming is subject to the same threats. Games such as "World of Warcraft" are termed Massively Multiplayer Online Role-Playing Games (MMORPGs) because they engage huge numbers of users; "World of Warcraft" holds the Guinness World Record for the most monthly subscribers, 11.6 million (Mitchell, 2009). The other most played MMORPGs include Final Fantasy, The Elder Scrolls Online, Guild Wars 2, Blade & Soul, Black Desert Online, RuneScape, EVE Online and Star Wars (IG Critic, 2016). Various Augmented Reality games, Pokémon Go for example, are also gaining popularity. Such virtual communities are not immune to cyber attacks.

This paper explores the subject of Virtual Identity and the risks and opportunities of losing it to cyber theft. It reports on how organizations, legally and illegally, are analysing your Virtual Persona, and what losing access to your Virtual Identity could mean. The paper focuses on Virtual Reality (VR), Augmented Reality (AR), and the analytical tools and services available to analyse Virtual Identities.

VIRTUAL REALITY: RISK & OPPORTUNITIES

Virtual Reality (VR) describes the world that exists in our minds when we interact online: a computer-generated artificial environment that users can interact with (Biocca & Levy, 1995), experienced via stimuli such as sounds and sights produced by a computer. Virtual Identities are created in VR and represent users in video games, chat rooms, virtual common spaces and similar environments. Identities created to complement particular virtual spaces and platforms are simply referred to as "Avatars" (Morgan, 2009). An Avatar includes representative video content or an image, a profile, and a name or "handle" that offers more information about an individual's Virtual Identity.

People create virtual identities by creating virtual representatives of themselves (Rheingold, 1991). In online games, an individual's Virtual Identity may be part of their identity yet differ from it. In other spaces, such as Basecamp, Virtual Identities may be less creatively oriented and represent the user's actual physical identity, with the user employing their own image or name for an Avatar (Witmer & Singer, 1998).

These virtual platforms pose special risks to users, as they are hubs for Cybercriminals. Because VR technology is built upon existing platforms (Lanier, 1992), it offers little new attack opportunity: at the highest level, VR is largely a new input and display mechanism added to traditional devices, and the underlying computers (whether mobile, personal computer or console) have not really changed much. However, VR facilitates positional and orientation tracking: physical body movements are tracked, and this comprehensive behaviour tracking can be quantified to understand preferences, divert the user's attention and even sell things (Rubin, 2016). Perhaps the risk it poses is no greater than that of any other device or software the user may add to his or her computer.

Today, the use of VR in gaming provides users with a fantasy world that is disconnected from reality. That disconnection gives identity thieves an opportunity to attack VR users and to monetize such attacks via social engineering.

Finally, tracking data from online shopping conducted in VR may allow Cybercriminals to mount dangerous attacks. Online shopping in VR provides users with an entirely different experience: they can browse items online and even try them on their Avatar. Unfortunately, the program used can identify a person's debit or credit card, and Cybercriminals can capture and sell this information.

A Cybercriminal can also use VR/AR headset trackers, together with web-coding tricks, to find valuable information about the user: by monitoring mouse clicks and movements, the attacker can recreate the user's actions in much the same way one could mimic manual PIN entry (Fox, Arena, & Bailenson, 2009).

AUGMENTED REALITY: RISK & OPPORTUNITIES

Augmented Reality (AR) describes a series of technologies (e.g. Head-Mounted Displays (HMDs)) that make real-time mixing of computer-generated content with live video possible (Azuma, 1997). AR integrates virtual information into a person's physical environment so that they perceive it as existing in that environment (Janin, Mizell, & Caudell, 1993). Its functioning is based on techniques developed in VR and interacts with the virtual world. AR technologies are defined by three features: (1) they are interactive in real time; (2) they combine the virtual and the real; and (3) they are registered in 3D (Azuma, Baillot, Behringer, Feiner, Julier, & MacIntyre, 2001). Accurate registration and tracking ensure that the user sees a believable image. The three key building blocks of AR systems are therefore real-time rendering, display technology, and tracking and registration (de Sa & Churchill, 2012).

New mobile wearable computing applications supporting AR functionality are increasingly becoming possible as computers shrink in size and grow in power, letting users access online services everywhere and always. This flexibility allows applications that exploit the surrounding context, and AR presents a powerful User Interface (UI) to context-aware computing environments (Mekni & Lemieu, 2013). AR already exists in consumer products including Microsoft's HoloLens, Google Glass, Apple's iPhone X, Samsung Pixie and games such as Pokémon Go.

AR devices may be prone to attacks that lead to identity theft. For instance, a Cybercriminal using social engineering and 3D models can alter and create fake videos and games. Computer scientists and animators have already created techniques that take a voice recording of a person and make them appear to say something they never said, and that give a person different lip movements and expressions by altering video of them. This can be achieved by tracking a history of a person's movements in VR. While these fake videos are yet to be perfected, they demonstrate how accurate 3D models and VR tracking could change things. An individual's unique identifiers could be their physical or verbal "tics" or unique movements; if compromised, Cybercriminals can use these personal intricacies to digitally impersonate a user or to socially engineer one's friends (Shatte, Holdsworth, & Lee, 2014).

AR technology was developed over forty years ago; Pokémon Go just made it mainstream. Cybercriminals see AR as an opportunity and have already seized on the popularity of games and various other applications to execute their malicious intents (Zhou, Duh, & Billinghurst, 2008). They have created Windows ransomware, SMS spam, scareware apps and lockscreen apps around it. One example is a fake Windows-based Pokémon Go Bot used to attack Pokémon Go players: the bogus bot application levels up the user's account with little effort by mimicking a fake Pokémon trainer (Paz, 2016).

People are also exploiting Pokémon Go to spread malware to the AR game via bogus guides (Tynan, 2017). Augmented wearable technology poses a serious risk because images in a person's field of view can be manipulated: Cybercriminals can substitute real virtual objects with fake ones. They could also reinvent a new version of ransomware for malicious purposes; with such a new breed of ransomware, Cybercriminals could make a doctor using Microsoft HoloLens lose control of the device or pay a ransom. Cybercriminals can also use AR devices to collect personal health data and biometric data for malicious intentions (Boyajian, 2017).

ANALYTICAL TOOLS AND SERVICES

Online technology has generated a huge amount of data from video streaming, social media activity, online game playing and Internet browsing. These data accumulate day by day from various sources, input through different methods and technologies. The accumulated data are called "Big Data": broad, fast and voluminous. Whether structured or unstructured, they remain useful for deriving data sets and subsets that online and non-online companies buy and use to increase market coverage and profits (Tiwarkhede & Kakde, 2015).

Companies engaged in analytic services record and then sell online profiles: user/screen names, email addresses, web site addresses, interests, preferences, home addresses, professional history, and the number of friends or followers a user has. Other companies gather and synthesize data on users' tweets, posts, comments, likes, shares and recommendations across their social media accounts (Beckett, 2012).

The analytic services and online data industry is reported to be a $300-billion-a-year business, employing around 3 million people in the United States alone (Morris & Lavandera, 2012). Many successful companies provide analytical services and data brokering; these companies supposedly know more about you than Google. The list includes Acxiom, Corelogic, Datalogix, eBureau, ID Analytics, Intelius, PeekYou, Rapleaf, and Recorded Future (Mirani & Nisen, 2014). They look into users' online personal profiles, gather information such as names, friends, activities and interests, and sell it to end users for advertising, marketing and other legitimate economic activities. Essentially, a broker collects information such as contact details, interests, preferences and demographics, then aggregates it into the subsets its clients need. Acxiom alone has recorded over a billion dollars in revenue for analytical services involving 144 million US households (Morris & Lavandera, 2012).
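The broker-style aggregation described above can be illustrated with a toy example. All profile data below is fabricated, and the segmentation criteria are arbitrary: the point is only that a pool of collected attributes is sliced into whatever subset a client asks for.

```python
# Fabricated profile pool standing in for broker-collected data.
PROFILES = [
    {"name": "user1", "age": 34, "interests": {"fitness", "travel"}},
    {"name": "user2", "age": 41, "interests": {"finance", "travel"}},
    {"name": "user3", "age": 29, "interests": {"gaming"}},
]

def segment(profiles, interest=None, min_age=0):
    """Return the subset of profiles matching a client's criteria."""
    return [
        p for p in profiles
        if (interest is None or interest in p["interests"]) and p["age"] >= min_age
    ]

# A client buying "travellers over 30" receives exactly that slice:
travellers = segment(PROFILES, interest="travel", min_age=30)
assert [p["name"] for p in travellers] == ["user1", "user2"]
```

Real brokers run this kind of selection over hundreds of attributes per person, which is what makes the aggregated pool so valuable and so sensitive.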

Data brokers are skilled at gathering data and knowing how to use it. They take advantage of the vast data available online to deliver relevant services to users, suggesting products and services the users might need, or subliminally suggesting that they need them. These companies claim that all the information gathered and sold is legal, secure and suitable for the users. Data brokers serve customers ranging from small enterprises to large Fortune 500 companies (Morris & Lavandera, 2012).

Data brokers source their information from a variety of places. Facebook, Google and other free apps collect your data and sell it to those willing to pay for it. And then there are Cybercriminals who steal this information and sell it on the dark net.

It is scary to think what damage a cyber attack on data aggregators could do. In September 2017, Equifax reported a massive data breach, initially said to affect 143 million people and later revised to 145.5 million. Cybercriminals accessed consumers' highly sensitive personal and financial information, including names, birthdates, addresses and credit card numbers (Hackett, 2017).

CONCLUSION

The price of a user's virtual persona depends on its legality, usage and the purpose of its application. Bank details, credit history and personal documents such as a driver's license are seen as high value. The Financial Times has published a calculator showing what each piece of your personal information is worth (Steel, Locke, Cadman, & Freese, 2013). The more that is revealed about your real and virtual behaviour, the more valuable your information is; and this information is constantly traded and resold to multiple buyers. It is not difficult to imagine that over the course of your lifetime (or afterlife) your persona may be worth 500 million Euros.

In almost all cases the owner of such personal information receives no income, not even a tiny share, from the revenues the analytics service providers generate by selling it to willing buyers. The owners instead face the risk of a security breach when their information is leaked to undesirable elements who use their identity to commit fraudulent and criminal activities, leaving them liable for credit fraud or for an unpaid loan they never applied for in the first place. The real owner of the personal data bears the burden of proving his or her innocence.

AR and VR devices are highly complex and relatively new, which makes them vulnerable and attractive to Cybercriminals looking for the weakest link. Some argue that Cybersecurity's weakest link is an organization's own employees (Banham, 2017). In social engineering, as this is known, Cybercriminals deceive their victims and gain their trust; once the Cybercriminal gains entry, the best protective software turns useless. Organizations therefore need to invest in ongoing Cybersecurity awareness for their employees.

Does it make sense to blame people who are the value creators in organizations? Shouldn’t technical systems be built for normal people rather than techies building systems for techies?

REFERENCES

1. Azuma, R. T. (1997). A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments, 6(4), 355-385.

2. Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., & MacIntyre, B. (2001). Recent advances in augmented reality. Computer Graphics and Applications, 21(6), 34-47.

3. Banham, R. (2017, March 20). The Weakest Link In Your Cyber Defenses? Your Own Employees. Retrieved 2017, from https://www.forbes.com/sites/eycybersecurity/2017/03/20/the-weakest-link-in-your-cyber-defenses-your-own-employees/#7815acac5d51

4. Beckett, L. (2012, November 9). Yes, Companies Are Harvesting - and Selling - Your Facebook Profile. Retrieved 2017, from ProPublica: https://www.propublica.org/article/yes-companies-are-harvesting-and-selling-your-social-media-profiles

5. Bimber, O., Raskar, R., & Inami, M. (2005). Spatial Augmented Reality. Wellesley: AK Peters.

6. Biocca, F., & Levy, M. (1995). Communication applications of Virtual Reality. Hillsdale, NJ: Erlbaum.

7. Boyajian, L. (2017, February 27). The 3 biggest challenges facing Augmented Reality. Retrieved 2017, from Network World: http://www.networkworld.com/article/3174804/mobile-wireless/the-3-biggest-challenges-facing-augmented-reality.html

8. de Sa, M., & Churchill, E. (2012). Mobile augmented reality: exploring design and prototyping techniques. 14th International Conference on Human-Computer Interaction with Mobile Devices and Services (pp. 221-23). ACM.

9. Eskelinen, M. (2001). Towards computer game studies. Digital Creativity, 175-183.

10. Fox, J., Arena, D., & Bailenson, J. N. (2009). Virtual Reality: A Survival Guide for the Social Scientist. Journal of Media Psychology, 95-113.

11. Hackett, R. (2017, October 2). Equifax Underestimated by 2.5 Million the Number of Potential Breach Victims. Retrieved 2017, from http://fortune.com/2017/10/02/equifax-credit-breach-total/

12. IG Critic. (2016). Most Played MMORPG Games of 2016. Retrieved 2017, from http://igcritic.com/blog/2016/03/17/most-played-mmorpg-games-of-2016/

13. Janin, A. L., Mizell, D. W., & Caudell, T. P. (1993). Calibration of head-mounted displays for augmented reality applications (pp. 246-255). IEEE.

14. Lanier, J. (1992). Virtual reality: The promise of the future. Interactive Learning International, 275-279.

15. Mekni, M., & Lemieu, A. (2013). Augmented Reality: Applications, Challenges and Future Trends. Applied Computational Science.

16. Mirani, L., & Nisen, M. (2014, May 27). The nine companies that know more about you than Google or Facebook. Retrieved 2017, from https://qz.com/213900/the-nine-companies-that-know-more-about-you-than-google-or-facebook/

17. Mitchell, B. (2009, June 5). E3 2009: Guinness World Records announces awards at E3. Retrieved 2017, from http://www.ign.com/articles/2009/06/05/e3-2009-guinnes-world-records-announces-awards-at-e3

18. Morgan, G. (2009, July 24). Challenges of Online Game Development: A Review. Retrieved 2017, from Simulation & Gaming (Sage): http://research.ncl.ac.uk/game/research/publications/87445d01.pdf

19. Morris, J., & Lavandera, E. (2012, August 12). Why big companies buy, sell your data. Retrieved 2017, from CNN: http://edition.cnn.com/2012/08/23/tech/web/big-data-acxiom/

20. Paz, R. D. (2016, August 24). Pokémon Go Accounts Targeted by Bogus Pokémon Go Bot. Retrieved 2017, from Fortinet: https://blog.fortinet.com/2016/08/24/pokemon-go-accounts-targeted-by-bogus-pokemon-go-bot

21. Rheingold, H. (1991). Virtual reality. New York: Simon & Schuster.

22. Rubin, P. (2016). AR, VR, MR: Making Sense of Magic Leap and the Future of Reality. Retrieved 2017, from https://www.wired.com/2016/04/magic-leap-vr/

23. Shatte, A., Holdsworth, J., & Lee, I. (2014). Mobile augmented reality based context-aware library management system. Expert Systems with Applications, 41(5), 2174-2185.

24. Smedinghoff, T. J. (2011). Introduction to Online Identity Management. Colloquium on Electronic Commerce.

25. Steel, E., Locke, C., Cadman, E., & Freese, B. (2013, June 13). How much is your personal data worth? Retrieved 2017, from http://ig.ft.com/how-much-is-your-personal-data-worth/?mhq5j=e5

26. Tiwarkhede, A. A., & Kakde, V. (2015). A Review Paper on Big Data Analytics. International Journal of Science and Research, 845-848.

27. Tynan, D. (2017, June 9). Augmented reality could be next hacker playground. Retrieved 2017, from https://www.the-parallax.com/2017/06/09/augmented-reality-hacker-playground/

28. Witmer, B., & Singer, M. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and Virtual Environments, 7(3), 225-240.

29. Zhou, F., Duh, B. I., & Billinghurst, M. (2008). Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (pp. 193-202). IEEE Computer Society.
