Richard J. Bocchinfuso

"Be yourself; everyone else is already taken." – Oscar Wilde

FIT – MGT 5115 – Wk 5 Responses

Question:  Regarding the vulnerabilities, I have had the same concerns that you mentioned with Open Source applications. Do you believe that with Open Source projects, where code is available to anyone, having more programmers able to quickly identify and correct vulnerabilities outweighs the potential for hackers discovering a vulnerability? I don’t have firsthand experience, but in previous classes we learned that programmers are normally on a time crunch with approaching deadlines, and therefore neglect security and take shortcuts in an application’s design. I also read that programmers will often change companies, leaving another programmer in place to fix the identified vulnerabilities and code errors, oftentimes with no notes from the outgoing programmer to help in the process.


Response:  I think the Open Source conversation cuts both ways.  With source code readily available, vulnerabilities can be identified more quickly and either exploited or patched.  There is also a tangential effect of the Open Source movement: tools like Snort are being built in the ecosystem that help us detect threats and close vulnerabilities.  Software development cycles are moving at a much faster pace today than they were ten years ago; rigid release cycles have given way to CI/CD (Continuous Integration / Continuous Delivery) and Blue-Green Deployments.  It’s said that 111 billion new lines of code will be put into production in 2017. That is a lot of code and a massive new attack surface, which will likely be targeted using vectors not previously seen.  It’s unrealistic to think that all this code will be vulnerability-free. The question in my mind is always focused on progress: if we live in fear and slow release cycles, do we reduce risk, and at what cost?  I think the Open Source community is critical to the overall ecosystem. Yes, there are vulnerabilities; for example, Shellshock impacted a large number of UNIX- and Linux-based systems using bash, and while we might think that tighter controls and release cycles could have avoided this, it’s unlikely.  With all that said, I believe the Open Source pros far outweigh the cons.  When we look around at where we are today, most of the progress would not have been possible without the Open Source movement.


Question:  So I would like your opinion on a thought process. Which came first, the chicken or the egg? Here is what I mean. Let’s look at hospitals being held by ransomware. Did this come about from TV shows portraying it and then some hacker saying, “I can do that”? Or did it start with a hacker and TV saying, “What a great idea”? Look at how many ideas came from TV and movies and, because of fantasy, became reality (cell phones, tablets, etc.). I’m still new to the IT world, but I don’t ever remember hearing about ransomware attacks on hospitals until after I saw about three TV shows featuring them. Of course, I have seen the same trend not just in hospital ransomware attacks but in other kinds of terrorist attacks around the world. So, your opinion: are we making hackers famous, or are we giving them ideas? Of course, this post is open for anyone to throw their ideas out here on it.


Response:  Scott, my general thought is that art imitates life, not the other way around, so I believe TV series like Mr. Robot are merely replaying events that have already taken place in a context that can be easily understood by the masses (Law and Order for the cyber enthusiast).  TV dramatizes the stereotype of a hacker because the truth is probably a little dry for mass consumption, but I don’t think TV is providing hackers with any new ideas, and most hackers prefer anonymity to fame.  The hacktivist group Anonymous (portrayed as fsociety on Mr. Robot) represents a cyber activist group interested in taking credit (anonymously, hopefully) for its activities, but the number of hacks they take credit for pales in comparison to the hacks that go undetected or undisclosed.

Interesting fact:  The vast majority of ransomware attacks (CryptoLocker, WannaCry, etc.) decrypt the data once the victim pays the ransom.  These are hacks for economic gain.  If there were reports that ransoms were paid but the data was not decrypted, then no one would pay, so ironically the idea of ransomware hinges on the belief that you will get your data back if you pay the ransom.  Couple this with the fact that most organizations don’t want to disclose that they were exploited, and you have the perfect storm for a booming business.

The first documented ransomware virus was identified in 1989 and was called the AIDS Trojan.
This week I simulated something similar in my presentation as an example of a high-tech method of hacking using a device called a USB Rubber Ducky (video:

As for new ideas discovered on TV, let’s explore this for a minute, maybe using the Star Trek Communicator as a good example.  HAM radio started being used in the 1890s, and Star Trek debuted in 1966.  My point is that good Hollywood is rooted in reality; even good science fiction is rooted in the ability to visualize what could be based on what is.  With that said, I would be willing to agree that Hollywood probably played a significant role in design and adoption rates, though I’m not sure if this is or will continue to be the case in the future.  The Motorola StarTAC and the Star Trek Communicator look pretty similar. Coincidence? I think not.  Hollywood clearly played a role in the design choices and adoption rate of the StarTAC, but these are consumer goods, and the tech was the tech.

There is a sub-culture out there, and when you’re not living it, it all seems new and shiny.  John Draper (aka Cap’n Crunch) hacked pay phones with a toy whistle from a box of Cap’n Crunch cereal in the 1960s, yet phreaking (the idea, not the name) wasn’t really done by Hollywood until 1983 in the movie War Games.  The whistle emitted a 2600 Hz tone that allowed free phone calls to be made from pay phones. Through the 70s and 80s, phreaking persisted as a vibrant sub-culture where hackers, mostly enthusiast tinkerers but some malicious, looked at the ever-expanding telephony system as a gauntlet laid down before them. Sound familiar? 🙂

I am an avid reader of 2600 magazine; if you are interested in the hacker sub-culture I recommend it.
If you just want to read some of the best stories, 2600 has published a couple of books which I recommend:
– The Best of 2600: A Hacker Odyssey
– Dear Hacker: Letters to the Editor of 2600

FIT – MGT 5115 – Wk 5 Discussion Post

Why is cybercrime expanding rapidly? Discuss some possible solutions, including acceptable-use policies, security procedures, and policies.

One of my favorite websites is the Norse Attack Map.
The Norse Attack Map does a good job of graphically depicting the amount of suspicious activity occurring on the Internet. I am also an avid reader of Krebs on Security, and it’s clear that hackers are motivated by differing agendas and that attack surfaces and entry points are increasing at an exponential rate. IoT is creating an unprecedented attack surface, and with the number of Internet-connected devices growing exponentially, I think it’s fair to expect that cyber attacks will remain on the rise.  Companies like Cisco are introducing what they call “The Network Intuitive” (if this is the only link you click, I suggest watching this video), which will leverage machine learning and AI to protect the network and its connected devices.

Our connected evolution from the ARPANET, to the Internet we are all so familiar with and have come to rely on, to the Internet of Everything (IoE) is what provides the basis for the rapid expansion of cybercrime. A quick look at the growth of the Internet and its connected devices provides insight into an attack surface that grows bigger with each passing day.

The Target breach was highlighted in the text (Turban, 2015, p. 149), and this was a violation that was probably avoidable with simple layer-one isolation. Why Fazio Mechanical Services, Target’s HVAC contractor, would have credentials on a network that had access to Target’s POS systems is a bit astounding. Hindsight is 20/20, and hackers have proven capable of penetrating facilities which are off the grid; this was the case with Stuxnet, a worm purportedly developed under an unacknowledged government operation called Olympic Games, a campaign to use cyberwarfare to disrupt Iran’s nuclear program. I highly recommend the movie Zero Days.

The Internet is the modern-day battlefield, the keyboard is the weapon of choice, and the ideal soldier is adept at sleep deprivation and enjoys Jolt Cola and cold pizza. Whether you hack for the challenge (e.g. – Kevin Mitnick), hack for hacktivism (e.g. – Barrett Brown), hack for money (e.g. – black, white and gray hat hackers for hire) or hack for a nation-state, you likely live inside a sub-culture which is experiencing exponential growth.

I think it’s important to note that the amount of Open Source software being deployed has exploded. This is important because the source code is easily accessible, which makes it easier for hackers to find and exploit vulnerabilities. This software is everything from operating systems like Linux, which powers the Internet in the form of servers, mobile devices, IoT devices, routers, switches, etc., to platforms like WordPress, which is said to power 28% of the websites on the Internet.  Linux and platforms like WordPress are honeypots because a vulnerability found in the Linux kernel, a GNU binary or the WordPress code can be exploited to inflict maximum damage.  It’s also important to recognize how important simple things like password length and complexity are. Tools like hashcat, cloud computing, and the accessibility of GPU computing have made cracking reasonably complex passwords a speedy task; what used to take years now takes minutes.
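The password length and complexity point is easy to make concrete with some back-of-envelope arithmetic. Here is a minimal sketch; the hash rate below is an assumed round number, roughly the order of magnitude reported for multi-GPU rigs attacking fast hashes, not a measured figure:

```python
# Back-of-envelope estimate of brute-force keyspace vs. cracking time.
# HASH_RATE is an assumed figure for illustration only.

HASH_RATE = 100e9  # guesses per second (assumed)

def seconds_to_exhaust(alphabet_size: int, length: int, rate: float = HASH_RATE) -> float:
    """Worst-case time to try every password of a given length."""
    return alphabet_size ** length / rate

# 8 lowercase letters vs. 12 characters drawn from ~94 printable ASCII characters
short_simple = seconds_to_exhaust(26, 8)
long_complex = seconds_to_exhaust(94, 12)

print(f"26^8  keyspace exhausted in ~{short_simple:.1f} seconds")
print(f"94^12 keyspace exhausted in ~{long_complex / 31557600:,.0f} years")
```

The short lowercase password falls in seconds at this assumed rate, while the longer complex one takes geological time, which is exactly why length and character-set size matter more than any clever substitution scheme.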

Cloud computing and its rapid adoption have not made these problems any easier to deal with.  As developers race to the cloud to become the next Unicorn, security practices are weakening.  One of my favorite stories is about a company called Code Spaces, which was put out of business by a hacker who gained root access to their AWS account and essentially deleted all their instances, data, and backups.  There are stories every day about developers inadvertently placing keys on GitHub, and there are bots actively crawling code repositories looking for keys.  In this connected world, access to information is awesome, and so is the ability to expose information that should not be exposed; good policies, procedures, automation, etc. are required to mitigate risk.

Acceptable use policies and training can be an effective means of influencing how users interact with systems that can either pose a direct or tangential cybersecurity threat.

Security policies and procedures define how to prevent and respond to security incidents. These policies and procedures focus on enforcement, designated and empowered incident response personnel, notification procedures, communication plans and monitoring external sources of information.

Examples of computer security related incidents and items that might be addressed in an acceptable use policy might include items such as:

  • A denial of service attack (DOS, DDOS).
  • Malware infections.
  • Policy violations, such as sharing offensive material, deliberate violation of information security policies, inappropriate use of systems and assets, and unauthorized escalation of privileges or subversion of access controls.
  • A user who defaces another organization’s public website.
  • Unauthorized access to a critical information system.
  • Internal hacking.
  • External hacking, including defacement of websites and malicious intrusion attempts into the internal network.
  • Unauthorized access using VPN or wireless remote access.
  • Abuse of authorized internal and external services.
  • Unauthorized changes to live systems.
  • An event requiring forensic investigation to obtain evidence (e.g. point of entry, compromise of data, etc.).
  • Information systems and assets being used to commit unlawful activity.
  • The actions of third parties who use computer systems to harm the reputation of an organization.
  • Theft of database content.
  • Theft of mobile computing property.
  • Misuse of an employee’s or customer’s personal information.
  • User disclosure of confidential information to external parties.

A security policy might include items such as the following:

  • An incident response plan that serves as a guideline for an overall approach to addressing information security incidents.
  • An intrusion detection procedure that establishes an intrusion detection system and parameters related to maintaining this system.
  • Processes that flow through all phases of a response to an information technology-related incident (preparation, identification, containment, eradication, recovery, and lessons learned).
  • A procedure within the plan that includes classifying an event and assigning a severity rating or priority.
  • Regular reporting requirements for summary reports to management.
  • Provisions for documentation of critical information necessary in the event of an incident and guidelines for all personnel to report observed suspicious activity.
  • Incident management procedures that include a severity rating assignment.
  • Establishment of guidelines for communication of incidents to outside parties.
  • Selection of an incident response team with designated roles and responsibilities.
  • Ongoing and scheduled training for the incident response team.

Outlining what constitutes acceptable use and how to respond to incidents can reduce risk and improve the ability to contain potential damages should a security incident arise.


Bate, Ben, et al. “WordPress now powers 28% of all websites.” Envato, 5 Sept. 2017, Accessed 27 Sept. 2017.

Finley, Klint. “Linux Took Over the Web. Now, It’s Taking Over the World.” Wired, Conde Nast, 3 June 2017, Accessed 27 Sept. 2017.

“Krebs on Security.” Brian Krebs, Accessed 27 Sept. 2017.

Tung, Liam. “ IoT devices will outnumber the world’s population this year for the first time.” ZDNet, ZDNet, 13 Feb. 2017, Accessed 27 Sept. 2017.

Turban, Efraim, et al. Information Technology for Management: Digital Strategies for Insight, Action, and Sustainable Performance. New Jersey (United States), Wiley, 2015.

FIT – MGT 5115 – Wk 4 Assignment

Use a mobile device (smartphone or tablet) to record several examples of organizations using mobile technologies in order to relate to customers, become more efficient, productive, and profitable.

Password: floridatech

FIT – MGT 5115 – Wk 4 Discussion Post

Explain how e-business processes improve productivity, efficiency, and competitive advantage for business organizations and the public sector (government and nonprofit organizations).

I think it is important to define the “e” in e-business. The “e” stands for “electronic” and implies that the business is networked (i.e. – connected). E-businesses typically make heavy use of technologies such as the Internet and electronic data interchange (EDI) to improve productivity, efficiency, and competitive advantage. An e-business applies technology and process to both external and internal business requirements but tends to focus more on internal processes as a means to improve productivity, efficiency, cost structures and competitive advantage.

E-businesses are connected businesses that leverage technology to operate effectively in a truly global economy.
These technologies might include:

  • Various wired, wireless and mobile networking technologies
  • APIs (application program interfaces) to streamline how applications talk to each other and exchange information
  • Collaboration and communication tools like e-mail, Cisco WebEx, Cisco Spark, Slack, etc., which are critical in a truly global economy
  • BI (business intelligence) tools
  • CRM (customer relationship management)
  • ERP (enterprise resource planning) systems
  • EDI (electronic data interchange)

These technologies allow organizations to better communicate both internally and externally using empirical data that is both accurate and relevant.
EDI extends this communication ecosystem by connecting disparate buyers, suppliers, and partners. In a B2B context, this helps organizations significantly improve operational efficiencies via the exchange of data that is critical to all stakeholders. This exchange of data can facilitate JIT (just-in-time) inventory strategies where buyers understand the inventory and lead time of suppliers, and suppliers understand potential demand. Exchanging information such as this allows both the buyer and the supplier to streamline their operations, making them more efficient and more productive, and ultimately more competitive in the market.
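The JIT logic described above can be sketched in a few lines. Real EDI uses standards like ANSI X12 or EDIFACT; the dict “messages” and field names below are invented purely for illustration:

```python
# Hypothetical sketch of the buyer/supplier exchange EDI enables for JIT.
# The supplier publishes stock and lead time; the buyer publishes demand;
# the buyer orders just enough to cover demand during the lead-time window.

supplier_status = {"sku": "WIDGET-42", "on_hand": 120, "lead_time_days": 5}
buyer_forecast = {"sku": "WIDGET-42", "daily_demand": 30}

def reorder_quantity(on_hand: int, daily_demand: int, lead_time_days: int,
                     safety_stock: int = 20) -> int:
    """Order enough to cover demand over the supplier's lead time."""
    needed = daily_demand * lead_time_days + safety_stock
    return max(0, needed - on_hand)

qty = reorder_quantity(supplier_status["on_hand"],
                       buyer_forecast["daily_demand"],
                       supplier_status["lead_time_days"])
print(f"Purchase order: {qty} units of {supplier_status['sku']}")  # 30*5 + 20 - 120 = 50
```

Without the data exchange, the buyer would have to guess at lead time and the supplier would have to guess at demand; with it, both sides can carry less inventory.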

The implementation of e-business practices in the public sector can provide benefits similar to those in the private sector. While the private sector focuses heavily on B2B (business-to-business) as well as B2C (business-to-consumer) applications of technology, it may be willing to abandon a particular segment of the population if that segment does not align with its mission and/or vision. A good example of this is Amazon: if you don’t have an Internet-connected device, you likely are not viewed by Amazon as part of their target market, and Amazon is willing to forego you as a customer. (Although the launch of brick-and-mortar Amazon stores, Amazon’s acquisition of Whole Foods and their recent partnership with Kohl’s might imply that Amazon wants to capture this segment of the market.) The public sector, by contrast, has a responsibility to ensure that the services it provides are available to everyone. The public sector has more of a B2C (business-to-consumer) and C2C (consumer-to-consumer) focus, and the use of websites, social media, and electronic communications has changed the way citizens access information, the information available to citizens, and information transparency. Public sector organizations capture a significant amount of data on citizens or members, and this data can now be better aggregated and analyzed, allowing public sector organizations to serve their constituents more efficiently and cost-effectively. Public sector organizations have to contend with challenges that the private sector can afford to ignore by simply stating that something is not a target market.

The lines are very blurry and becoming more blurred with each passing day.  Many technical disruptors are building both B2B and B2C platforms; data captured from both business partners and consumers makes the platforms more robust and meaningful to all the stakeholders.  I think about companies like Uber and their relationship with drivers and riders.  The Uber-to-driver relationship is a B2B relationship, while the relationship between Uber and the rider is a B2C relationship.  This same paradigm exists with Amazon and the Amazon marketplace, and there are many more examples.  What’s clear is that as we become more connected, the growth in productivity, efficiencies, new entrants, opportunities, etc. is exponential, not linear.


Bartels, A. (2000, October 30). The difference between e-business and e-commerce. Retrieved September 21, 2017, from

Ripley, H. (2014, January 29). How e-business transforms public sector services in the UK. Retrieved September 21, 2017, from

Turban, Efraim, et al. Information Technology for Management: Digital Strategies for Insight, Action, and Sustainable Performance. New Jersey (United States), Wiley, 2015.

FIT – MGT 5115 – Wk 3 Assignment

You will create a PowerPoint presentation to address the question below. Your PowerPoint presentation should be between 8-12 slides, and developed as if you are presenting to fellow colleagues within the IT industry. 

What are the functions of databases and/ or data warehouses?  Present examples from an office environment or other industry with which you have personal experience (i.e.: health field, accounting, fitness environment, academic institutions), that illustrates these functions (billing, customer searches, etc.)

[google-drive-embed url=”” title=”FIT MGT 5115 Week 3 Presentation” icon=”” width=”100%” height=”400″ style=”embed”]

FIT – MGT 5115 – Wk 3 Discussion Post

How does data quality impact business performance? Using your textbook as a resource, describe the functions of database technology, the differences between centralized and distributed database architecture, how data quality impacts performance, and the role of a master reference file in creating accurate and consistent data across the enterprise.

When poor data quality, such as missing and/or erroneous data, negatively impacts operations, it can cost organizations business, affecting revenues and profits.  Missing and/or erroneous data can affect current revenues, frustrate customers and place an organization’s reputation at risk, putting both existing and future business at stake. Data quality issues can decrease efficiency and increase costs; a lack of confidence in data integrity causes organizations to spend time and money on data validation and error correction activities.

The goal of a database is to store data in a structured way (maybe).  Two popular database architectures are SQL and NoSQL.  SQL databases, or RDBMSs (Relational Database Management Systems), are relational with a defined schema. NoSQL databases, or document databases, are often schemaless and rely on key-value pairs defined at ingest.  As you can imagine, ingesting (or inserting) data into a specified schema makes managing data integrity easier than defining the key-value pairs at the time of ingest.
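The schema-at-write vs. schema-at-read distinction is easy to demonstrate. A minimal sketch using Python’s built-in sqlite3 as the relational side and a plain list of dicts standing in for a document store (the table and field names are invented for illustration):

```python
import sqlite3

# Relational side: the schema is fixed up front, so bad rows are rejected at insert.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("INSERT INTO customer (id, name) VALUES (1, 'Ada')")
try:
    conn.execute("INSERT INTO customer (id, name) VALUES (2, NULL)")  # violates NOT NULL
except sqlite3.IntegrityError as e:
    print("rejected by schema:", e)

# Document side: no schema; each record defines its own key-value pairs at ingest.
documents = []
documents.append({"id": 1, "name": "Ada"})
documents.append({"id": 2, "email": "g@example.com"})  # nothing stops a missing name
print("documents accepted:", len(documents))
```

The relational engine catches the bad record at write time; the document store happily accepts it, which is why data-quality enforcement shifts to the application (or to read time) in schemaless systems.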

Centralized and distributed database architectures are quite intuitive.  Centralized database architectures centralize the storage and control of data, while distributed database architectures allow data to be stored on edge devices such as laptops, tablets, and mobile devices, or distributed using master/master, master/slave or parent/child relationships.  Centralized database architectures offer greater control of data quality because all data is stored in a single physical location; adds, updates and deletes can be made in a supervised and orderly fashion.  Centralized database architectures also allow for better security: it is easier to control physical and logical access to a centralized architecture, and the attack surface is limited when contrasted with a distributed system.

Centralized and distributed database architectures each come with tradeoffs which should be considered when selecting an appropriate architecture.  With more and more processing being pushed to the edge (e.g. – mobile and IoT growth) and with ever-increasing big data demands, decentralized distributed databases like Apache Cassandra and RethinkDB are experiencing massive growth.  Centralized databases like Microsoft SQL Server, MariaDB and others are still very prominent, but even these centralized database players are trying to adapt their architectures to support distributed models to capitalize on the big data revolution.

Master reference files provide a common point of reference and act as a single source of truth for a given data entity. Data entities might include customer, product, supplier, employee or asset data. As a single source of truth, master reference files are used to feed data into enterprise systems and maintain data quality and integrity.
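The single-source-of-truth idea can be sketched in a few lines. The field names and the `conform` helper below are hypothetical, not taken from any particular master data management product:

```python
# Minimal sketch of a master reference file acting as a single source of truth.
# Source systems may spell a customer differently; conforming each record to
# the master's "golden" values keeps the enterprise consistent.

master_customers = {
    "C-100": {"name": "Acme Corp", "country": "US"},
}

def conform(record: dict) -> dict:
    """Overwrite whatever a source system says with the master's golden values."""
    golden = master_customers.get(record["customer_id"])
    if golden is None:
        raise KeyError(f"unknown customer {record['customer_id']}")
    return {**record, **golden}

# Billing and CRM disagree on the spelling; both conform to the master.
billing = conform({"customer_id": "C-100", "name": "ACME CORPORATION", "amount": 99})
crm = conform({"customer_id": "C-100", "name": "Acme, Corp.", "last_contact": "2017-09-01"})
print(billing["name"], "==", crm["name"])  # both read "Acme Corp"
```

Note that the unknown-customer branch is as important as the happy path: a record that cannot be matched to the master is itself a data-quality signal.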


Buckler, Craig. “SQL vs NoSQL: The Differences — SitePoint.” SitePoint, SitePoint, 18 Sept. 2015, Accessed 14 Sept. 2017.

“Do You Know How Data Quality Impacts Your Business?” BackOffice Associates, Accessed 14 Sept. 2017.

Turban, Efraim, et al. Information Technology for Management: Digital Strategies for Insight, Action, and Sustainable Performance. New Jersey (United States), Wiley, 2015.

FIT – MGT 5115 – Wk 2 Discussion Post

Why do managers and workers still struggle to find the information that they need to make decisions or take action despite advances in digital technology? That is, what causes data deficiencies?

Corporate infrastructure and decision support systems have evolved over decades. Over this same period, organizations have endured management changes, shifting priorities and differing perspectives on the role of IT. Data silos, lost or bypassed data, poorly designed interfaces, nonstandardized data formats and chronically in-flux requirements further compound the natural system and organizational challenges brought on by progress.

In my opinion, organizations can begin to combat some of the corporate infrastructure and organizational behavior issues by having a clear vision and mission when it comes to information systems. Management changes will happen, but an organization that has a clear vision and mission regarding the value of data and information will stay focused on strategic objectives amidst regime change. Everyone within the organization should be viewed as a stakeholder and a beneficiary. There is an education process that needs to take place; all the parties concerned need to have the right reaction to the blue button moment. Data silos, poor user interface design, etc. persist because the wrong choice is made when a blue button moment occurs. The ability to change the future depends on the decisions we make now.

The wrong blue button moment:
System doesn’t love embedded images so here is a link:

Source: Alex Cowan – Getting Started: Agile Meets Design Thinking, University of Virginia

The right blue button moment:
System doesn’t love embedded images so here is a link:

Source:  Alex Cowan – Getting Started: Agile Meets Design Thinking, University of Virginia

I believe we are lowering the barrier to entry when it comes to how we transform data into information. For years the industry spent time trying to force data into a common data model for business intelligence (BI); this normalization process usually consisted of one or more ETL (extract, transform, load) jobs. These jobs were typically batched, and the end state was a normalized data set pushed into a relational database management system (RDBMS), where the relational SQL database schema was rigid and comprised of tables consisting of columns, rows, and fields. We called these DSS (Decision Support Systems) data warehouses and data marts. Fast forward a few years, and many of the information systems which leveraged historical data as the primary predictor of the future are pivoting towards NoSQL databases where key-value pairs have replaced SQL relationships. NoSQL information systems are meant for massive real-time ingest; these systems are being used to build data lakes. The ability to use key-value pairs removes the need for a rigid schema, often removing the need for an ETL process. The field of Data Science, NoSQL platforms like Hadoop, applications like Elasticsearch for indexing and Kibana for visualization, and programming languages like R, Python, Octave, etc. make capturing data and performing analytics easier than ever before. With the advent of public cloud platforms like AWS EMR, AWS Data Pipeline, Google Cloud Dataflow and Google Cloud Dataproc, many of the issues like data silos, lost or bypassed data, poorly designed interfaces, and nonstandardized data formats are being addressed.  The technology is being adapted to address matters that have persisted for a very long time; these new technologies are streamlining processes, shortening the time to value and solving some of the issues mentioned above.
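The ETL pattern described above fits in a few lines of code. A toy sketch using Python’s standard library, with invented column names standing in for a real source extract:

```python
import csv
import io
import sqlite3

# Toy ETL job: extract rows from a CSV export, transform them into a
# normalized shape, and load them into a relational table.

raw = io.StringIO("order_id,customer,total\n1,ada,19.99\n2,grace,5.00\n")

def extract(fh):
    """Extract: parse the raw CSV into dict rows."""
    return list(csv.DictReader(fh))

def transform(rows):
    """Transform: cast types, uppercase names, store totals as integer cents."""
    return [(int(r["order_id"]), r["customer"].upper(), round(float(r["total"]) * 100))
            for r in rows]

def load(rows, conn):
    """Load: push the normalized rows into a rigid relational schema."""
    conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total_cents INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT customer, total_cents FROM orders").fetchall())
# [('ADA', 1999), ('GRACE', 500)]
```

A NoSQL ingest of the same data would simply store each dict row as-is, skipping the transform and the fixed schema entirely, which is precisely the tradeoff the paragraph above describes: faster ingest in exchange for deferring data-quality work to read time.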


Data Lake vs. Data Warehouse: Is the warehouse going under the lake? (2016, July 22). Retrieved September 06, 2017, from

NoSQL vs SQL- 4 Reasons Why NoSQL is better for Big Data applications. (2015, March 19). Retrieved September 06, 2017, from

Turban, E., Volonino, L., & Wood, G. R. (2015). Information technology for management: Digital strategies for insight, action, and sustainable performance. New Jersey (United States): Wiley.