Advising and Investing for Growth
http://brianmurrow.com

Five must-do initiatives for the global banking system – Cyber-risk wake-up call
http://brianmurrow.com/Blog/tabid/123/ID/4112313/Five-must-do-initiatives-for-the-global-banking-system-Cyber-risk-wake-up-call.aspx

The FireEye security breach discovered last week underscores the diligence required to secure the global financial system in its current fragile state. The malware-infected software was downloaded by an estimated 18,000 customers, which undoubtedly resulted in thousands of compromised networks. Although the banking regulators have already proposed requirements that supposedly address the enhanced risks brought about by last week's events, it falls squarely on every financial services institution to ensure the security of its own infrastructure and, collectively, the integrity of the global financial system. There has already been plenty of discussion on the far-reaching implications of the current infestation. Nonetheless, I want to focus on the five must-do initiatives to maintain the integrity of our global financial system.

Incident Response: The key to successful response is having a well-defined and well-communicated incident response plan. Proper planning and action help organizations facilitate the detection, response, and recovery components necessary to respond quickly and appropriately to a breach or incident. Although it has been determined that the malware was designed with a kill switch that is activated once communication with the botnet server domain is removed, this does not mean that we are out of the woods. As my friend and colleague Steve Ursillo has written extensively, it is not uncommon for an attacker's tactics, techniques, and procedures (TTPs) to broaden their access and foothold by compromising multiple layers of systems, networks, and trusted environments through privilege escalation, pivoting, and lateral movement. As a result, it is critical that organizations conduct a proactive and timely incident response to ensure the confidentiality, integrity, and availability of their data, systems, and environments.

Cybersecurity Risk and Gap Assessment: Organizations should have a formalized cybersecurity risk assessment methodology that includes FFIEC, NIST, PCI, NCUA, and SOC standards. It is a common mistake to think of these as perfunctory regulatory requirements. They should instead be thought of as proven frameworks for understanding the true residual risk of an asset. If executed appropriately and taken seriously, such an evaluation will prioritize efforts and resources to support increased controls maturity in areas that require enhanced safeguards to protect covered assets and data. Steve Ursillo has a methodology he uses that is a solid pre-emptive defense approach.
Third-Party and Vendor Risk Management: With the popularity of managed services, technology outsourcing, and cloud solutions, firms need to be more vigilant than ever in ensuring that third parties are actively assessing and providing proof of proper cybersecurity risk management protocols. Ensuring that appropriate supply-chain risk assessments are conducted to minimize the risk of a supply-chain attack has become critical. In addition, verifying that third parties are covered by a sufficient cybersecurity report that validates their cybersecurity maturity becomes a necessary element of this process. From an IT perspective, minimizing third-party risk starts with frameworks and attestations such as SOC, PCI, HITRUST, CMMC, and ISO. In addition, there are ways of monitoring third-party risk with advanced AI, using both commercial and open-source tools. In a future blog, I'll highlight a number of effective tools developed for third-party risk, including one developed by a team of researchers at The George Washington University.

Identity and Transaction Monitoring: The implications of this breach on the global banking system will likely involve an epic level of identity and account theft. Although individual consumers will need to be vigilant in monitoring their account and credit data, it will fall on the global financial institutions to ensure the integrity of their transaction-related data. Traditional counter-fraud processes rarely provide ongoing, real-time reviews of identity or account-level data outside of one-time KYC checks and traditional rules-based transaction monitoring. Tools such as IBM's Safer Payments and Verituity enable the use of AI in identity and transaction monitoring. In future blogs I'll dive further into these capabilities (a minimal illustrative sketch appears at the end of this post).

Business and Accounting Controls: A major problem I have observed, and one that has also been highlighted in a series of very recent OCC enforcement actions, is the lack of codification of the data, technology, and financial controls required to ensure the secure operation of the banking system. Many of the major financial services organizations have tens of thousands of controls in their controls inventories and GRC systems. It may seem that with so many controls, all the risks must be covered. Instead, resources are being wasted on testing duplicative controls, leaving other controls, such as cybersecurity and data privacy, untested or even missing. As a starting point, I often end up recommending a thorough assessment, rationalization, and fortification of financial services organizations' controls. This usually results in saving costs and lowering the organization's risk profile. For more ideas and a solid methodology, my friend and colleague Stephen Masterson has written extensively on this topic.

Over the next few weeks I will dive into a few of the above topics in some more actionable detail.
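To make the real-time identity and transaction monitoring idea concrete, here is a minimal, hypothetical Python sketch that scores a single payment against a customer's recent history. It is a simplified stand-in for the AI-based tools mentioned above, not a description of any of them; the fields, weights, and thresholds are illustrative assumptions only.

```python
from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class Payment:
    customer_id: str
    amount: float
    country: str
    device_id: str

def score_payment(payment: Payment, history: list) -> float:
    """Return a score in [0, 1]; higher means more anomalous for this customer."""
    score = 0.0
    amounts = [h.amount for h in history]
    if amounts:
        mu, sigma = mean(amounts), pstdev(amounts) or 1.0
        if (payment.amount - mu) / sigma > 3:          # amount far outside normal range
            score += 0.5
    if payment.country not in {h.country for h in history}:
        score += 0.3                                   # never-before-seen country
    if payment.device_id not in {h.device_id for h in history}:
        score += 0.2                                   # new device
    return min(score, 1.0)

history = [
    Payment("C1", 120.0, "US", "dev-1"),
    Payment("C1", 95.0, "US", "dev-1"),
    Payment("C1", 140.0, "US", "dev-2"),
]
suspect = Payment("C1", 9_500.0, "RO", "dev-9")
print(score_payment(suspect, history))   # high score -> route for review
```

A production system would combine far richer features, learned models, and millisecond-latency streaming infrastructure; the point of the sketch is only the shift from one-time KYC checks to continuous, per-transaction scoring.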
Brian - Sun, 20 Dec 2020 22:10:00 GMT

Impact Alpha: There need not be a tradeoff between profits and social impact
http://brianmurrow.com/Blog/tabid/123/ID/2467784/Impact-Alpha-There-need-not-be-a-tradeoff-between-profits-and-social-impact.aspx

This month marks the 50th anniversary of the publication of Milton Friedman's seminal article in the New York Times, "The Social Responsibility of Business Is to Increase Its Profits." Therefore, I thought this week would be a good time to share the work I have been doing with teams of students from my graduate-level business analytics class at The George Washington University. Through our analysis of social impact and portfolio returns, we have found that the premise of Friedman's thesis does not hold:

- Social impact can be measured through the utilization of big data analytics
- There need not be a tradeoff between investment returns and greater social impact
- Specifically, in the healthcare sector, higher social impact often leads to higher financial returns

Over the course of the past year, we have been applying current evidence-based ESG concepts to the development of a new methodology that utilizes the combined power of Natural Language Processing (NLP) and Machine Learning (ML) to:

- Value firms by applying advanced ML techniques to fundamental financial analysis
- Quantify firms' current and potential social impact

We call this methodology "Impact Alpha". The Impact Alpha methodology was designed to empower investors and consumers to outline a series of social impact objectives and tailor their investment portfolio or corporate support accordingly. Given the general alignment of social impact and the healthcare sector, we decided to focus on developing an AI-based portfolio selection model that evaluates publicly traded healthcare firms and selects an analytically optimal portfolio of investments with both the highest estimated returns and the highest social impact. Although we decided to focus on the healthcare sector prior to the outbreak of COVID-19, as the pandemic spread and the performance of the global economy declined, it became increasingly obvious that this methodology could be used not only to select an investment portfolio but also to surveil global big data to evaluate products and firms that will ultimately support "flattening the curve".

Impact Alpha applies a score between -1 and 1 to every publicly traded healthcare firm, where the absolute value measures the level of the impact, a positive score implies a relative net benefit to society, and a negative score implies relative harm to society. Furthermore, each firm has an Impact Alpha score in each of six categories:

- Saving Lives
- Curing & Treating Life-Threatening Disease
- Improving the Lives of the Elderly
- Lowering Costs
- Enabling Improved Coordination
- Embracing Value-Based Care

These broad categories were borrowed from research compiled by EntryPoint Capital. More detail on these categories can be found in their white paper.
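To make the scoring mechanics concrete, here is a minimal, hypothetical Python sketch of how per-category scores on the -1 to 1 scale might be combined into a single Impact Alpha score using investor-supplied category weights. The category keys, example weights, and scores are illustrative assumptions, not the production model (the actual pipeline is summarized in the next section).

```python
import numpy as np

# The six Impact Alpha categories described above (keys are illustrative).
CATEGORIES = [
    "saving_lives",
    "curing_treating_disease",
    "improving_elderly_lives",
    "lowering_costs",
    "enabling_coordination",
    "value_based_care",
]

def impact_alpha(category_scores: dict, weights: dict) -> float:
    """Combine per-category scores (each in [-1, 1]) into one Impact Alpha score
    using user-supplied category weights."""
    w = np.array([weights.get(c, 0.0) for c in CATEGORIES], dtype=float)
    s = np.array([category_scores.get(c, 0.0) for c in CATEGORIES], dtype=float)
    if w.sum() == 0:
        raise ValueError("At least one category weight must be non-zero")
    # A weighted average of values in [-1, 1] stays in [-1, 1],
    # preserving the sign (benefit vs. harm) and magnitude interpretation.
    return float(np.dot(w, s) / w.sum())

# Example: an investor who cares most about lowering costs and value-based care.
scores = {"saving_lives": 0.4, "lowering_costs": 0.7, "value_based_care": 0.2}
weights = {"saving_lives": 1.0, "lowering_costs": 2.0, "value_based_care": 2.0}
print(round(impact_alpha(scores, weights), 3))
```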
In backtests, our model calculated the Impact Alpha for each month between 2010 and 2020 using publicly available unstructured data available prior to each period, such as news stories and journal articles. At a high level, the steps of the model include:

- News Scraping: Evaluating millions of articles, disambiguating and extracting firm names, and selecting articles about US-based healthcare firms.
- News Classification: Classifying topics within the articles into one or more of the six social impact categories.
- Sentiment Score: Calculating the sentiment score for each topic identified within each article.
- Calculate Final Score: Users of the methodology can weight each of the six Impact Alpha categories for their personal objectives.

In comparing the pre-COVID era with where we are now, the Impact Alpha model has largely reallocated the entire "portfolio" to firms involved in the prevention, care, and treatment of COVID-19. For example, it has:

- Doubled its allocation in drug companies, with a focus on the firms that have been in the news regarding vaccine developments
- Doubled both its COVID-related bio-medical and medical services allocations
- Removed its dental supplier and nursing home allocations

The purpose of this discussion is not to recommend specific stocks or sub-sectors but to discuss the concept of an innate alignment of social responsibility, corporate earnings, and financial returns. We are continuing to surveil and report on the data and refine the methodology. Please reach out if you are interested in discussing the results and/or contributing to the ideation.

Brian - Mon, 21 Sep 2020 15:47:00 GMT

Keeping Economic Stimulus out of Criminal Hands Requires Lenders Taking Preventative Measures
http://brianmurrow.com/Blog/tabid/123/ID/22620/Keeping-Economic-Stimulus-out-of-Criminal-Hands-Requires-Lenders-Taking-Preventative-Measures.aspx

On March 27, 2020, the Coronavirus Aid, Relief and Economic Security Act, known as the CARES Act, became law. CARES contains $376 billion in relief for American workers and small businesses. These funds were allocated to several new temporary programs. Most of these funds will initially be administered by the U.S. Small Business Administration (SBA), but the ultimate responsibility for distribution of these funds to recipients is in the hands of our country's established financial services sector. And as additional programs are formed, such as the Fed's $2.3 trillion loan program announced this week, there will be additional pressure on the US banking system to quickly and effectively distribute funds to American businesses. Unfortunately, if history is any guide, the criminal world is hard at work determining how to take these funds from well-deserving American workers and small businesses through means such as:

- Application Fraud: Corporate and individual identity theft to apply for fraudulent relief.
- Account Takeover: Gaining direct access to small business and individual checking, credit, and/or debit card accounts to directly steal the Federal loans and grants.
Industry experts, including the Government Accountability Office, expect significant fraud to come out of the implementation of the CARES Act – as high as $60 billion, or one to three percent of the total funds in any given program. For example, to date, prosecutors have recovered over $11 billion in fraud resulting from criminals taking advantage of the 2008 $400 billion Troubled Asset Relief Program (TARP). It is estimated that the actual unrecovered TARP fraud is many times higher.

In today's economic climate, these statistics should make us, as Americans, very angry. Under normal circumstances, one expects a certain level of criminal activity. But in a global public health crisis, with unemployment soaring, every dollar allocated by Congress for hardworking American employees and businesses needs to get into the hands of its intended recipients. We need to do everything we can to ensure that happens.

In spite of the fraudsters, these economic stimulus programs are critical to the recovery of the American economy. To keep this money out of criminal hands, we need to move away from the industry-standard "pay and chase" stance toward proactive fraud prevention programs that intercept fraud before and as it is happening. Fortunately, in 2020, unlike in 2008, advanced artificial intelligence and big data tools have become mainstream in the fight against financial crime and can work in real time to shut down fraud before it happens. For example, the GAO estimates that by using real-time analytics, the IRS prevented more than $6.5 billion in fraudulent refund returns between 2015 and 2017.

Since the distribution of CARES Act funds to businesses is not in the hands of the Federal government but instead in the hands of the private sector, it is imperative that banks and our country's lending institutions utilize the power of big data and artificial intelligence to prevent improper payment of these precious CARES Act funds.

I lead a team at IBM that focuses on counter-fraud advisory and solutions for the North America financial services sector. We are currently teaming with banks and other financial services institutions to help CARES Act funds get into the hands of the intended recipients and avoid the fraud trap. We make it our daily mission to shut down fraudulent payments before they happen. To do so, we have put in place and are currently delivering two quick-to-market AI-based solutions:

- Fraud Asset Management System (FAMS) is IBM's solution that identifies fraudulent loan and grant applications to root out criminals at the registration process, before payments are made, avoiding the "pay and chase" process typical of application fraud.
- Safer Payments is IBM's AI-powered counter-fraud payment and management solution that monitors billions of transactions with millisecond, real-time performance. With stimulus funding protected by IBM Safer Payments, financial institutions and payments firms are safely facilitating payments to stimulus recipients, integrated with lenders' existing payments solutions.

Whether one takes advantage of IBM's counter-fraud advanced AI capabilities or finds other effective means, it is imperative for our recovery, as a nation, that we prevent funds from the CARES Act and any additional stimulus funding from getting into the hands of criminals.
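As a simplified illustration of screening applications at registration time, before any payment is made, here is a hypothetical Python sketch that flags a few common red flags, such as a payout account or EIN shared across applications. It is not FAMS or Safer Payments; the fields, rules, and threshold are illustrative assumptions only.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class LoanApplication:
    app_id: str
    ein: str            # employer identification number claimed
    bank_account: str   # payout account
    payroll_claimed: float

def screen_applications(apps: list) -> dict:
    """Flag registration-time red flags before any money moves."""
    by_account = defaultdict(list)
    by_ein = defaultdict(list)
    for a in apps:
        by_account[a.bank_account].append(a.app_id)
        by_ein[a.ein].append(a.app_id)

    flags = {}
    for a in apps:
        reasons = []
        if len(by_account[a.bank_account]) > 1:
            reasons.append("payout account shared across applications")
        if len(by_ein[a.ein]) > 1:
            reasons.append("EIN reused across applications")
        if a.payroll_claimed > 10_000_000:   # illustrative threshold
            reasons.append("implausibly large payroll claim")
        if reasons:
            flags[a.app_id] = reasons
    return flags

apps = [
    LoanApplication("A1", "12-3456789", "021000021-111", 250_000),
    LoanApplication("A2", "98-7654321", "021000021-111", 400_000),  # shared account
]
print(screen_applications(apps))
```

Real application-fraud systems layer identity verification, device intelligence, and learned risk scores on top of simple consistency checks like these.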
Brian - Sun, 12 Apr 2020 21:41:00 GMT

How Business Leaders Can Prepare for COVID-19's Transformational Impact
http://brianmurrow.com/Blog/tabid/123/ID/22619/How-Business-Leaders-Can-Prepare-for-COVID-19s-Transformational-Impact.aspx

Over the past few weeks the world, as we know it, has changed:

- It is estimated that at least 40% of the global population will eventually contract COVID-19
- The stock market (S&P 500) is down 16.1% year to date (up from yesterday's low of negative 23.2%)
- The US will most certainly go into a recession

Unlike prior recessions of the past 30 years, this crisis did not start in the financial or housing markets. This is a global health crisis with dramatic financial and economic implications. In addition, given the runway needed for readiness of testing and vaccines, we need to be prepared for an indefinite period of disruption. As business leaders, we need to treat the cause and not the symptoms. This will be hard to do. We need to understand that this is a permanent transformation, not just a short- to medium-term continuity-of-operations problem.

Without protective measures, this health crisis will quickly overwhelm the capacity of our healthcare system. With protective measures that slow the spread, it may be possible for the current healthcare system to meet demand while capacity is ultimately increased. At this point, it is clear that any Federal, state, or local government intervention that occurs will not, by itself, be enough to slow the spread of COVID-19. Therefore, to minimize the business impact, it is up to us, as business leaders, to step up and slow the spread amongst our employees, customers, and society. What this means is that as business leaders, we can have a measurable and direct impact on the spread of COVID-19 and its effects on our employees and customers. It is our responsibility to do so. And we can achieve this while also doing right by our businesses' bottom lines and minimizing the business impact of COVID-19 and of the forthcoming recession.

What can we do now as business leaders?

1) Integrate distance working into the mainstream

- Social distancing: Ensure constant productivity, regardless of time-shifting and location-shifting.
- School closures: Employees need flexibility in their work schedules to care for their families as school closures continue.
- Workforce tooling: Create an atmosphere of accountability and collaboration and provide the appropriate tools to support a dispersed work environment.
- Workforce participation: To continue to get day-to-day business done and drive earnings, firms need to continue to maximize workforce participation while meeting the changing environment.
- Customer interaction: Drive communication campaigns to encourage customer-facing teams and customers to accept remote and automated customer interaction.
2) Embrace artificial intelligence ("AI") in everyday workflow

- Enhance productivity: To drive productivity in uncertain economic times, integrate existing AI tools into a broader series of workflows to enhance top-line business capabilities.
- Reduce cost: The utilization of existing AI tools can enable cost reduction while further enhancing productivity.
- Minimize person-to-person contact: AI tools are mainstream in consumer product industries (e.g., Apple and Google Home). Utilizing similar capabilities in the business environment can minimize person-to-person contact while enhancing customer service – especially during high-demand times like we are experiencing (have you tried to call an airline or bank recently?)

3) Enhance policies and controls from regulatory-focused to risk-focused, geared for the 2020s

- Assess operational risk disruptions: During prior financial crises, regulators stepped in to evaluate systemic risk impacts. Although there are new regulations in place, such as the "Request for Assurance of Operational Preparedness Relating to the Outbreak of the Novel Coronavirus" (and I expect more), businesses need to self-assess the true and relevant business impacts and utilize real-time monitoring and controls. We need to look at this as a business risk mitigation activity and not a compliance exercise.
- Assess business and market risk: Optimize portfolio composition and risk-return profiles for impacted products, sectors, and companies.
- Minimize the impact of credit deterioration: Review and enhance credit risk identification and loan workout capabilities. Thanks to enhanced AI and machine learning capabilities, we can now do this proactively and should begin immediately – before this becomes a problem.

How do we get started in making change today?

- Get uncomfortable: Some industries, such as technology, have embraced remote working and the integration of AI for over 20 years. But for the rest of the world, these are new concepts and will take some getting used to as business leaders get uncomfortable.
- Embrace business transformation: At an academic level, we have known that business needed to embrace these changes. But at a practical level, the business-as-usual scenario has dominated. Those emerging from this crisis as successful leaders will treat it as an opportunity to step up and embrace these transformational aspects.
- Act quickly: The exponential nature of the spread of COVID-19 requires us all to act quickly. The transformations that are required are not new or unproven – but I believe that it is a fear of the underlying reasons for the changes (in this case COVID-19) that leads to the denial that comes across as a fear of change. We have no choice but to act quickly – if not for the well-being of our employees, then for the certain business impact of inaction.

What can we do differently as business leaders?

- Embrace outsourcing: Smart outsourcing is an opportunity to transition employees to environments that are more conducive to remote work, utilizing advanced IT and AI tools, while both saving costs and increasing customer service.
- Establish a remote workforce: In addition to outsourcing, providing existing employees the IT and AI tools to work smarter while working safer will result in increased productivity that will yield bottom-line results.
- Integrate AI: AI is nothing new, but it has been slow to be adopted into everyday work processes.
By integrating AI into both customer-facing and back-office processes, businesses can yield cost savings, increase employee satisfaction, and enhance customer experience.

- Review risk and controls framework: Historically, operational and financial risk and controls have been geared toward financial-market-driven issues and traditional continuity-of-operations remediations. Businesses should be prepared for a longer-term impact of this health crisis and adjust their controls horizon scanning appropriately.

In future blogs over the coming days, my colleagues and I will detail specific actions that can be used to create an operational risk roadmap for COVID-19-related transformation readiness. Please feel free to engage in the dialogue. There will be no easy answers, but through solid ideation and rapid action we can make a difference.

Brian - Fri, 13 Mar 2020 16:34:00 GMT

Top Risk and Compliance Trends for 2019 in Financial Services
http://brianmurrow.com/Blog/tabid/123/ID/22618/Top-Risk-and-Compliance-Trends-for-2019-in-Financial-Services.aspx

In the years since the financial crisis, there has been a significant focus within the financial services sector on conservative, organic growth while managing risk and compliance expectations. In recent months, the sector has been experiencing a shift. The rapid adoption of Fintech and Regtech, combined with the ever-increasing pressure on cost reduction, has created an opportunity for leading practice, established firms to adopt an information-led strategy for rapid growth and market share while effectively managing risk and compliance. Further opportunity arises as upstart Fintechs stumble in their attempts to gain market share from the well-established banks and diversified financial services firms (firms). Below, I highlight a few of the leading trends that are rapidly emerging in 2019.

Data to Support Insight-driven Banking

Over the past few years, most financial services firms have focused on ensuring the quality of the data they use for regulatory reporting purposes. As a result of this hard work and investment, these firms are well positioned to monetize these data through the application of advanced and cognitive analytics. Firms are discovering that their risk and regulatory compliance reports rely on the same quality, governed data and customer master data required for top-line expansion. Specific opportunities include enhancing:

- Customer retention programs, identifying customers likely to move part or all of their business to other firms (a brief sketch of this use case follows below).
- Cross- and upselling, identifying current customers who are candidates for expanding their business through additional products and/or services.
- Predictive lending, identifying customers who may be ready to purchase or refinance a home.

As firms embark on this journey, fortifying their proprietary in-house data is a unique competitive advantage of the established players relative to the emerging Fintechs.
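Here is the brief customer-retention sketch referenced above: a minimal, hypothetical Python example that fits a propensity-to-attrite model on governed customer data and surfaces the customers most at risk. The fields and data are invented for illustration; this is not IBM's Customer Insights for Banking.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: fields and labels are illustrative only.
history = pd.DataFrame({
    "tenure_months":      [3, 48, 27, 60, 7, 14, 90, 36, 5, 22],
    "products_held":      [1, 4, 2, 5, 1, 2, 6, 3, 1, 2],
    "balance_trend_3m":   [-0.4, 0.1, -0.1, 0.2, -0.6, -0.2, 0.3, 0.0, -0.5, -0.1],
    "digital_logins_30d": [2, 25, 10, 30, 1, 6, 40, 12, 0, 8],
    "attrited":           [1, 0, 0, 0, 1, 0, 0, 0, 1, 0],
})

features = ["tenure_months", "products_held", "balance_trend_3m", "digital_logins_30d"]
model = LogisticRegression(max_iter=1000).fit(history[features], history["attrited"])

# Score current customers and surface the ones most likely to leave.
current = history[features].copy()
current["attrition_risk"] = model.predict_proba(current[features])[:, 1]
print(current.sort_values("attrition_risk", ascending=False).head(3))
```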
As a result, we are seeing leading practice firms accelerating their journey to monetize their internal data by utilizing existing advanced cognitive analytics tools, such as IBM's Customer Insights for Banking and other unique accelerators.

Data Privacy Programs

Stories on the front pages of the Wall Street Journal about data theft, data breaches, and the improper use of customer data are becoming all too frequent. Without trust in the institution and its ability to safeguard, manage, and use data appropriately, business is lost. In addition to these business drivers for data protection, the regulators are actively enforcing GDPR, and California's privacy law goes into effect in January 2020. It is critical that organizations protect the privacy of their customer data and that this information be managed to the highest standards. To achieve this, leading practice financial services firms proactively address data privacy through:

- Development of a data protection strategy
- Safeguards for sensitive data
- Continuous monitoring of data access
- Stringent privacy policies, to which the firm adheres and against which it audits

To facilitate these leading practices, firms are utilizing services and tools to help ensure customer trust and regulatory compliance.

Payment Fraud Prevention through Cognitive Analytics

As the number of online consumer payment options grows, so do the opportunity for and incidence of payment fraud. Global criminal networks are increasingly adept at exploiting weaknesses in law enforcement, regulatory oversight, and consumer naiveté about the complex payments landscape. Stolen credit card numbers are available for $5 each on the dark web. Entire buildings overseas house the manufacture of fake cards. Leaders of criminal networks operate from overseas, with only the occasional in-country "mule" getting caught. Unfortunately, payment providers are largely left on their own not only to devise methods for fraud prevention, but also to determine the nature of fraud in the first place. To meet these challenges, leading practice financial services organizations address their fraud problems proactively through:

- Cognitive, self-learning, self-correcting detection capabilities to facilitate rapid adaptation of detection and prevention algorithms to rapidly evolving criminal patterns and behavior.
- Intelligent insight into cross-channel data and patterns to detect complex fraud and shut down related accounts across those channels.
- Intelligence sharing amongst traditionally competing financial institutions as they realize that they can achieve a net benefit by sharing information about criminal networks and fraud patterns, not only by trading ideas at conferences but also through utilities that provide near real-time intelligence.

Fortunately, there are suites of tools that help financial services firms adopt these leading practices. For example, IBM's Safer Payments is a platform that supports proactive, cognitive capabilities for payment fraud alerts.

Trade and Conduct Surveillance

Financial services firms face major challenges in improving trade and conduct surveillance output and mitigating risk, thereby reducing losses, fines, and reputational risk. Threats posed by innovative bad actors inside major financial services firms are increasing at the global level.
As a result, financial services firms need to surveil and monitor trading activity and the communications between employees and customers taking place around that activity. While traditional structured data may be relatively simple to query and analyze, communication data comes in a variety of unstructured formats, including voice, email, instant messages, and even video files, and is therefore much harder to unpack and analyze for relevance. Complicating matters, the surveillance data required to identify bad actors needs to be integrated from multiple sources. It is not simply transaction and communications data, but also Product Control, HR, AML, and KYC data that helps identify individuals of concern.

To meet these challenges, established financial services firms are embracing the integration of digital, data, analytics, and risk management into all facets of their business and operating model, applying a holistic cognitive approach to trade surveillance to detect, profile, and prioritize complex trading scenarios. These approaches leverage advanced and cognitive analytics to learn from post-event investigations and to incorporate external events into risk factor analysis. Using cognitive analytics, a unified surveillance system builds out risk scenarios based on multiple risk factors and exposures to those risk factors to identify deviations from normal patterns of behavior as they occur. For example, a breach in a trader's limits correlated with a decision not to take annual leave might become of greater interest to supervisors. By utilizing a cognitive system that can both "remember" and effectively "reason" through a known violation type, compliance officers can achieve greater efficiency by reviewing low-level alerts in aggregate and can make better decisions with the use of more data in context. To facilitate these leading practices, there are a number of tools, such as IBM's Surveillance Insight for Financial Services.

Digital Audit, Compliance, and Risk Monitoring and Testing

To manage the ever-increasing risk and compliance challenges faced by today's financial services firms, there has been a proliferation of new audit and regulatory compliance tests and risk monitoring requirements. As a result, firms face a proliferation of manual processes, inconsistent access to data, a partially documented controls environment, and a lagging reaction to new and/or changing regulations. The traditional approach to addressing this increased workload has been to hire additional staff – taking on exponentially higher costs. Leading practice financial services firms are utilizing enhanced digital capabilities to streamline the audit, compliance testing, and surveillance environments across all three lines of defense. These firms are implementing:

- Automated workflow tools for the development and implementation of enhanced controls and testing
- Robotic process automation tools to automate and accelerate rote testing activities
- Advanced and cognitive analytics to automate complex human reasoning that is consistent and doesn't get tired

The utilization of these digital tools enables firms to move from sample testing to full-population review. This results in decision-makers having access to real-time information that provides deeper insights for timely decision-making.
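To illustrate the move from sample testing to full-population review described above, here is a minimal, hypothetical Python sketch that tests a single control, dual approval of payments over a threshold, against every record rather than a sample. The control, fields, and threshold are illustrative assumptions.

```python
import pandas as pd

# Hypothetical control: every payment over $10,000 must have a distinct second approver.
payments = pd.DataFrame({
    "payment_id": ["P1", "P2", "P3", "P4"],
    "amount":     [2_500, 48_000, 12_000, 75_000],
    "approver_1": ["alice", "bob", "carol", "dave"],
    "approver_2": [None,   "erin", None,    "dave"],
})

THRESHOLD = 10_000

in_scope = payments[payments["amount"] > THRESHOLD].copy()
in_scope["violation"] = (
    in_scope["approver_2"].isna()                          # missing second approval
    | (in_scope["approver_2"] == in_scope["approver_1"])   # self-approval
)

exceptions = in_scope[in_scope["violation"]]
print(f"Tested {len(payments)} payments, {len(in_scope)} in scope, "
      f"{len(exceptions)} exceptions")
print(exceptions[["payment_id", "amount", "approver_1", "approver_2"]])
```

Because the test runs over the full population, the output is a complete exception list rather than an extrapolated error rate, which is what enables the real-time, decision-ready reporting described above.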
Anti-Money Laundering and Know Your Client

New and increasingly similar regulations across jurisdictions have rapidly recast AML from a standalone compliance function into a progressively complex, overarching operation affecting or incorporating legal, risk, compliance, operations, and tax departments. To meet these challenges, proactive financial services organizations are:

- Integrating cross-organization data, moving away from a traditionally siloed approach by combining insights from multiple detection systems, including KYC, suspicious activity monitoring, and watchlist filtering.
- Using advanced and cognitive analytics to transform risk and compliance management and improve decisioning, from regulatory change management to specific compliance processes. Regulators are beginning to expect institutions to incorporate improved technology into their AML programs.
- Adopting a risk-based approach to modernizing transaction monitoring systems, reducing the operational overload from legacy systems with high false-positive rates.
- Implementing back-office process automation to drive operational efficiency and refocus resources away from mundane tasks.

To facilitate these leading practices, it is recommended that traditional AML/KYC compliance workflow tools be integrated with other leading practice tools, such as IBM's Alerts Insight with Watson.

Why Addressing These Trends Is Important

Depending on the firm's current position, there are two primary reasons to address these trends in 2019:

- Top-line expansion: Many traditional firms have addressed the underlying regulatory and risk issues embedded within these trends. Therefore, the highest-return payoff is monetizing the investments made in regulatory compliance and risk. There is huge potential top-line expansion waiting to be tapped in firms' existing data.
- Increased regulatory scrutiny: Firms that have not addressed the fundamentals of risk management and regulatory compliance will be under increased scrutiny by both US and foreign regulators, as well as customers. This heightened risk lens is becoming more important as Fintechs experience missteps with customers and as markets and interest rates continue their volatility.

In future blogs this year, I will further detail these trends, providing specific techniques and examples of remediating these risks while using these solutions for top-line expansion opportunities. For additional discussion on the use of advanced and cognitive analytics, please click here.

Brian - Wed, 02 Jan 2019 01:27:00 GMT

Risk & Compliance, Empowered by Cognitive
http://brianmurrow.com/Blog/tabid/123/ID/22617/Risk-Compliance-Empowered-by-Cognitive.aspx

In the ten years since the financial crisis, there has been a significant focus within the financial services industry on conservative, organic growth while managing risk and regulatory expectations. Over the past 18 months, the industry has been experiencing a shift.
The rapid adoption of Fintech and increased pressure on cost reduction have created an opportunity for leading practice firms to adopt an information-led strategy using Cognitive Business Decision Solutions (CBDS) tools to achieve rapid growth and market share while effectively managing risk and compliance.

At IBM, we are excited to be working with our clients to provide risk and compliance solutions built on our industry-leading advanced and cognitive analytics.

More information

Brian - Sat, 22 Dec 2018 16:16:00 GMT

Inaugural Smith Analytics Conference
http://brianmurrow.com/Blog/tabid/123/ID/20393/Inaugural-Smith-Analytics-Conference.aspx

Join me and my colleagues from the University of Maryland's Smith School of Business on April 21, 2017 for the Inaugural Smith Analytics Conference. In the afternoon I will be discussing the impact of Digital Labor Analytics on the financial services risk and regulatory space. Looking forward to seeing you there.

Brian - Thu, 20 Apr 2017 14:30:00 GMT

Heightened Expectations for Banks around Risk Governance: Across the Three Lines
http://brianmurrow.com/Blog/tabid/123/ID/16620/Heightened-Expectations-for-Banks-around-Risk-Governance-Across-the-Three-Lines.aspx

In a recent publication with the Global Association of Risk Professionals, my colleagues and I discuss how large financial institutions (those with more than $50 billion in assets) are prioritizing their practices to adhere to the Office of the Comptroller of the Currency's (OCC) Heightened Standards, which call for the development and implementation of a Risk Governance Framework. Specifically, banks need to enhance their risk management structures, frameworks, and processes to comply with the OCC's heightened expectations, which entails several moving parts. For more details regarding leading practices on how banks are successfully deploying the risk governance framework, both within the daily activities of financial institutions' risk management functions and across their front line units, please take the time to read through the attached blog, Risk Governance: Across the Three Lines.

___
This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation.
Brian - Mon, 21 Mar 2016 16:08:00 GMT

Risk Data and Reporting Career Opportunities
http://brianmurrow.com/Blog/tabid/123/ID/16539/Risk-Data-and-Reporting-Career-Opportunities.aspx

In this blog space, I typically discuss various leading-edge examples of how financial services organizations and the financial regulators are using Big Data Analytics to advance their organizations' missions. Today, I am going to invite folks to be a part of the excitement. At KPMG, my team is doing extensive hiring in all areas of Risk Data and Reporting using Big Data Analytics and is looking to hire people at all levels of experience in the following areas:

- Risk Data and Reporting Architect
- Risk Data and Reporting Analyst

So if you or someone you know is looking to make a difference, challenge the status quo, and build expertise using leading-edge Big Data Analytics and reporting tools and techniques, please click one of the links above, explore the opportunities, apply, and drop me a note.

Brian - Fri, 04 Mar 2016 00:37:00 GMT

Ten Key Regulatory Challenges Facing the Financial Services Industry
http://brianmurrow.com/Blog/tabid/123/ID/16310/Ten-Key-Regulatory-Challenges-Facing-the-Financial-Services-Industry.aspx

In the spirit of "New Year Prognostications," at KPMG we are predicting the ten key regulatory themes that we expect will demand the attention of financial services organizations in 2016. These themes are a mix of both new and ongoing issues. The complexity of the regulatory environment and regulators' continued focus on governance and accountability are likely to keep these issues "front and center" as the United States moves into a major election cycle. The ten key regulatory themes include:

- Strengthening governance and culture
- Improving data quality, data aggregation, and data reporting
- Merging cybersecurity and consumer data privacy
- Accommodating the expanding scope of the consumer financial services industry
- Addressing pressures from innovators and new market entrants
- Transforming the effectiveness and sustainability of compliance
- Meeting challenges in surveillance, reporting, data, and control
- Reforming regulatory reporting
- Examining the interplay between capital and liquidity
- Managing the complexities of cross-border regulatory change

For more details on the above themes, please take the time to read through the attached paper, Ten Key Regulatory Challenges Facing the Financial Services Industry in 2016.

___
This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity.
No one should act on such information without appropriate professional advice after a thorough examination of the particular situation.

Brian - Wed, 06 Jan 2016 03:31:00 GMT

Advances in Big Data Analytics for Financial Services Organizations
http://brianmurrow.com/Blog/tabid/123/ID/15969/Advances-in-Big-Data-Analytics-for-Financial-Services-Organizations.aspx

Big Data analytics is transforming the financial services landscape. For many years, the focus of big data analytics has been on the client acquisition and marketing side of the financial services industry. But over the past few years much progress has been made in utilizing big data analytics techniques such as natural language processing, or NLP.

As I have described in earlier blog posts, traditionally unstructured data was too complex for common statistical programming languages to analyze. Therefore, in order to gain meaningful interpretations from unstructured data, people had to manually sort through various and voluminous sources of data to draw conclusions. With limited resources and an overwhelming quantity of data, this method has become inefficient and costly. As a result, a series of tools and methods have emerged over the years that enable analysts to process and analyze both structured and unstructured data to gain valuable insights and help guide business decisions.

NLP refers to techniques that allow text to be analyzed to identify trends, patterns, and statistically relevant findings by mining thousands of data sources. Using NLP techniques, analysts can discover unexpected correlations or anomalies for businesses to investigate and can analyze at a higher level than previous technologies would allow. Businesses are increasingly using NLP to react faster to changes in data and make better business decisions. Organizations in industries such as banking, insurance, healthcare, and law enforcement can benefit from the use of NLP.

In an article published this month in CFO magazine, "Big Data, Smaller Risk," David Katz explores the role of big data through a series of use cases impacting financial services. In particular, David gives me the opportunity to describe in detail a use case that I have implemented in the financial services industry with regard to monitoring borrower support using NLP.

___
This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation.
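As a small postscript to the NLP discussion above, here is a minimal, hypothetical Python sketch of mining unstructured servicing notes for a trend, in this case the monthly volume of borrower-distress language. It is deliberately far simpler than the borrower-support monitoring use case described in the article; the notes, terms, and scoring are illustrative assumptions.

```python
from collections import Counter
import re

# Hypothetical servicing-call notes; in practice these would number in the millions.
notes = [
    ("2015-07", "Borrower reports hardship after job loss, requested forbearance"),
    ("2015-07", "Routine escrow question, no issues raised"),
    ("2015-08", "Borrower frustrated, said payment was misapplied, may file complaint"),
    ("2015-08", "Asked about refinance options, hardship not mentioned"),
    ("2015-09", "Complaint about repeated calls, borrower disputes late fee"),
]

DISTRESS_TERMS = {"hardship", "forbearance", "complaint", "misapplied", "disputes"}

def tokenize(text: str):
    return re.findall(r"[a-z]+", text.lower())

# Count distress-related language by month to surface an emerging trend.
trend = Counter()
for month, text in notes:
    hits = DISTRESS_TERMS.intersection(tokenize(text))
    trend[month] += len(hits)

for month in sorted(trend):
    print(month, trend[month])
```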
Brian - Mon, 05 Oct 2015 20:35:00 GMT

2016 Risk Data Aggregation Deadline Approaching
http://brianmurrow.com/Blog/tabid/123/ID/15790/2016-Risk-Data-Aggregation-Deadline-Approaching.aspx

In January 2013, the Basel Committee on Banking Supervision (BCBS) issued a set of 14 principles for risk data aggregation and risk reporting (RDAR), known as BCBS 239. BCBS 239 seeks to improve the risk management and decision-making processes at banks by improving how each bank defines, gathers, and processes risk data and how it measures performance against risk tolerances. The 14 RDAR principles are intended to address what the BCBS sees as one of the most significant lessons learned from the financial crisis that began in 2007 – the inability of banks to quickly and accurately identify risk exposures and risk concentrations at the bank group level, across business lines and between legal entities.

My KPMG colleagues and I provide a summary overview of the looming deadline in the report 2016 Risk Data Aggregation Deadline Approaching. Over the next few months, in a series of blogs and reports, we will provide in-depth analysis of leading practices in RDAR, from the perspective of both the regulators' expectations and financial services operating models.

___
This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation.

Brian - Sat, 22 Aug 2015 18:36:00 GMT

A picture – or visualization – is worth a thousand words
http://brianmurrow.com/Blog/tabid/123/ID/13915/A-picture-or-visualization-is-worth-a-thousand-words.aspx

In our last Conversation on Big Data we discussed the importance of integration, of designing data repositories so that facts are organized by a set of data elements that provide business context. Actually, good design is as important for a successful quantitative analytics program as it is to finding the perfect Italian suit or pair of shoes, or to furnishing your home. The IBM Center for The Business of Government and the Partnership for Public Service sponsor our Conversations on Big Data because we think there is value in communicating creative uses of quantitative analytics. This blog takes that value proposition further – to show the value of creativity itself to quantitative analytics. Whether you are building a sports car or an analytics program, good design should be your mantra.
Some of you may be skeptical that there is a relationship between design as in a Ferrari Testarossa and design in data repositories and quantitative analytics. Yet there is a mingling of art and science in both. The polarization of "Art" and "Science" is a 20th century concept. Many of the great Renaissance masters were skilled at mathematics and physics as well as their art. Charles Darwin's sketches are as artistic as a Picasso etching. So it should not be a surprise that good analytics require good design. This is even more important for the output of quantitative analytics than for the data repositories that provide the input.

The best way to sound a death knell for your analytics program is to present your findings in ways that fail to communicate effectively to your audience. Think back to the last time you were at a meeting where someone read out loud a set of slides with very little expression in their voice. How long was your interest held? What did you take away? Odds are your answers are "not long" and "not much." Traditional reporting techniques are the equivalent of reading wordy slides in a monotone. A much more effective way to present your findings is to leverage data visualization. Duke University defines data visualization as: "an umbrella term, usually covering both information and scientific visualization. This is a general way of talking about anything that converts data sources into a visual representation (like charts, graphs, maps, sometimes even just tables)"[1]

To use data visualization effectively, there are a few common guidelines[2]. The first one is know your audience. Are they familiar with your program and its goals? Do they understand how quantitative analytics works? What are they expecting and how do they assimilate information? Once you've defined whom you are presenting your findings to, the second guideline is create a framework. If your audience includes folks who have no knowledge of what your analytics program is about, or analytics at all, include some background. Even if your audience is educated about your program and analytics, you still must relate your visualization to the data in some way. Bubble charts, for example, can express data points along many dimensions. The size of the bubble, its color, and its position on the x and y axes can all be leveraged to communicate, but your audience needs to know what those combinations mean. The third guideline is to tell a story with your data and your charts. Don't approach your task as simply showing a group of findings backed up by numbers. Tell your audience the tale by starting at the beginning, proceeding through the middle, and concluding with the punch line you want them to take away. Once you have your story structured, you can think about which visualizations are most effective for each part.

Many experts refer to the 3C's of Visualization:

1. Clarity: The ability to quickly understand what data the visual is displaying, and how it is displaying it.
2. Connectivity: How well the visualization connects disparate data points.
3. Concentration: How well the visualization brings certain (sets of) data points forward and focuses the viewer on them.[3]

Steve Beltz, the Assistant Director of the Recovery Operations Center, shared some examples of how data visualization is leveraged at the Recovery, Accountability and Transparency Board, which monitors how ARRA awards are distributed and spent.
Transparency is provided on the ARRA web site, where anyone can click on a map of the US and drill down into the awards granted to ensure that the money is being used as stated in the award.[4] Steve says that "the software … is actually representing the data in graphics … if you picture a landscape and you see these mountain ranges of different data points, the larger mound is something that gets talked about a lot … where these little teeny tiny mounds are only referenced once or twice … but those are actually what we are interested in, and we'll zoom in on those and find out what's being said in that area." A picture that lets someone instantly identify their particular area of interest, zoom in and expand or drill down into it, and gain knowledge about it is truly worth a thousand words – or slides!

Check out past Conversations on Big Data for more tips and insights on using big data to improve your organization's mission effectiveness.

[1] http://guides.library.duke.edu/content.php?pid=355157&sid=2904817
[2] https://hbr.org/2013/04/the-three-elements-of-successf
[3] http://online-behavior.com/analytics/effective-data-visualization
[4] http://www.recovery.gov/arra/Transparency/RecoveryData/Pages/RecipientReportedDataMap.aspx?stateCode=FL&PROJSTATUS=NPC&AWARDTYPE=CGL

___
This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation.

Brian - Tue, 17 Feb 2015 01:25:00 GMT

Integration makes Big Data Useful Data
http://brianmurrow.com/Blog/tabid/123/ID/13914/Integration-makes-Big-Data-Useful-Data.aspx

Big Data is a term that is bandied about quite a bit these days. One sees references to Big Data in television commercials, in pop-ups on web sites, and in print advertisements. Have you ever stopped to think about why Big Data has become the latest technical buzzword? Of course, I think Big Data is much more than a buzzword, and so do the Partnership for Public Service and the IBM Center for The Business of Government, who have sponsored our Conversations on Big Data because we think there is value in communicating how public agencies are using quantitative analytics creatively to better deliver services to their constituencies. We have seen several examples of Big Data at work in this blog.

What we need to remember is that Big Data is not just a vast sea of facts and figures, pulled together using sophisticated toolsets, and backed up by text and images. Big Data gives us the ability to tie all those data points and documents together to form a picture or snapshot. What's special about Big Data is integration. The importance of integration was a key theme of our Conversation with Lori Walsh, Chief of the Center for Risk and Quantitative Analysis at the Securities and Exchange Commission (SEC).
Lori described her group as the “center of a hub-and-spoke system” that centralizes information from regional offices across the country. The trading of securities in the US is subject to a set of laws that require transparency. Transparency is achieved by the collection of data for each and every transaction, and about each and every participant. According to Lori, the value provided through analytics is “pulling together the pieces of the puzzle. If you just have a big pile of pieces, you can’t tell what the picture is. When you start organizing the data, you can start putting the pieces of the puzzle together and … a picture starts to emerge.” How do you accomplish that organization so that you can integrate all those pieces of the puzzle? The first step in integrating data is always discovery. What exactly are all these little pieces? In terms of a securities transaction, the pieces are: 1.)    The Buyer and the Seller 2.)    The Security being traded and how many Units of the Security are being exchanged 3.)    The Amount the Buyer is paying the Seller 4.)    The Date upon which the transaction occurred All of these data elements can be precisely defined and documented. For example, Buyers and Sellers are typically financial institutions registered with the SEC who place trades according to instructions received from investors hoping to increase the value of their portfolio of securities. Defining data elements in business terms is critical to providing context to any analysis of the data itself. That is called business metadata. The second step in integration is collection. Once we have discovered and defined the pertinent data points we are dealing with, we need to collect them into a single repository. The “hub and spoke” structure that Lori referred to provides a metaphor for building a repository of integrated data. Continuing with our example, as securities transactions occur, they are documented by the Seller, the Buyer, whatever exchange in which they occur, etc. We now have many records of our transaction, all of which are captured in data repositories. To make that data meaningful we need to organize those repositories by determining which of those data elements provide context. Look again at our example of a securities trade. Within our dataset, there are numbers (Amount and Units) and one date. The other data elements are “Who” (Buyer and Seller) and “What” (the Security traded). When we think of our data elements in that sense, we can start thinking of our transaction as “Who is buying What When” or “Who is selling What When?” Organizing our repository by looking to our business metadata to determine our context enables us to start querying – asking questions – that are relevant to our data consumers. Over time, we have built up many large data repositories with well-defined content that has been organized in order to provide answers to common business questions. That is one level of data integration, but now let’s take a look at the next level. Lori Walsh provided us with an example of how the SEC leverages integration across numerous data repositories. “… perhaps we have a tip about … insider trading. We have done … an examination of the broker dealer who was named in the tip … We have previously done investigations on that broker but never brought an action against him or her, and by pulling this information together, … we’re able to scan across our databases and see … information bubble up immediately that gives more credibility to the tip. 
So it is having … the data in place and the pieces organized so that when a new piece comes in you can immediately see where it fits and how it associates with the other information that we already have.” This is why Big Data is much more than a buzz word. Big Data provides the ability to integrate data from different repositories that are organized differently and have widely varying content but contain at least one commonly defined data element (in this example the broker), that enables us to pull all that disparate data out, look at where it intersects, and then connect the dots to make reasonable, fact-based conclusions. Check out past Conversations on Big Data for more tips and insights on using big data to improve your organization’s mission effectiveness. ___ This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation. Brian Tue, 10 Feb 2015 00:59:00 GMT f1397696-738c-4295-afcd-943feb885714:13914 http://brianmurrow.com/Blog/tabid/123/ID/13913/Is-Big-Data-the-same-as-Dirty-Data.aspx#Comments 0 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/RssComments.aspx?TabID=123&ModuleID=637&ArticleID=13913 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/Tracking/Trackback.aspx?ArticleID=13913&PortalID=0&TabID=123 Is Big Data the same as Dirty Data? http://brianmurrow.com/Blog/tabid/123/ID/13913/Is-Big-Data-the-same-as-Dirty-Data.aspx Conversations on Big Data is a series of discussions about using analytics in creative and interesting ways that the Partnership for Public Service and the IBM Center for The Business of Government designed to broaden the perspective about quantitative analytics. Wikipedia defines Big Data as “…an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process using traditional data processing applications”[1]. Other sources support that definition:  “Big data usually includes data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage, and process data within a tolerable elapsed time.”[2] Big data "size is a constantly moving target, as of 2012 ranging from a few dozen terabytes to many petabytes of data”.[3] Those who have been working with analytics for many years may see an application of Moore’s Law in the evolution of applications that can now generate data on this scale. Yet, those very same analytics users have been dealing with the challenges of defining and implementing data governance standards to ensure, among other things, a high level of data quality and integrity.  How do we ensure the veracity needed for credible analytics while at the same time leveraging Big Data, which by definition is a massive amount of raw data?   According to the officials who have participated in our Conversations, dirty data is better than no data. Dean Silverman of the Office of Compliance Analytics at the IRS says that “there’s an endless amount of data and if you don’t know where you’re going, any road will take you there.” In other words, simply taking the time to decipher and understand the structure of your data can provide insight and lead to deeper, more structured analytics.  
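Dean's point about taking the time to decipher and understand the structure of your data can be made concrete with a small profiling pass. The sketch below assumes a hypothetical pandas DataFrame of raw records and simply reports types, missing values, and distinct counts for each column; it illustrates the idea only and is not the IRS's tooling.

```python
# A minimal data-profiling sketch: describe the shape of raw data before analyzing it.
import pandas as pd

# Hypothetical raw records; in practice these would be read from a file or database.
raw = pd.DataFrame({
    "taxpayer_id": ["A1", "A2", "A2", "A3", None],
    "filing_year": [2013, 2013, 2014, 2014, 2014],
    "refund_amount": [1200.0, None, 830.5, 0.0, 455.0],
})

profile = pd.DataFrame({
    "dtype": raw.dtypes.astype(str),
    "missing": raw.isna().sum(),
    "distinct": raw.nunique(),
})
print(profile)
print(raw.describe(include="all"))  # quick distributional summary per column
```

Even this crude summary surfaces questions (why is one ID missing, and why does another appear twice?) that lead to the deeper, more structured analytics Dean describes.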
However, if you need to address a specific issue or problem, Dean says that, too, “starts with the data.” The data governance best practices referred to above that were developed to maintain the quality of structured data can be extended to unstructured data, such as PDF documents and image formats. Hadoop, an open source framework developed to support Big Data, is used to capture unstructured data and “mine” for patterns that can be used to build a data “map.” That map can provide the basis to integrate the data into an enterprise model and potentially determine correlations and relationships. Over time, Big Data tools store the history of those patterns and can continuously apply them as that huge volume of variable data in a variety of formats continues to stream into the organization at a high velocity. Variations in those patterns can be detected as the data stream is in motion, offering potential opportunities to influence behavior in real-time. The patterns detected within the raw data itself provide the standards for evaluation.   The insight required to apply analytics to Big Data is acquired through processes similar to more traditional business intelligence methods. First you have to acquire the data and store it, so that you can analyze it and propose a working model of its structure. Once the data is acquired and modeled, it can be managed according to data governance best practices. That working model will include history regarding patterns in data that can be applied in real-time to those data sources and streams. As variations occur the working model is refined until the user community is confident that they fully understand that model. Then the data can be transformed and integrated into enterprise repositories. The working model of a data stream can also be used to identify real-time opportunities to influence behavior by initiating predefined actions triggered by occurrences of specific variations in the patterns.     Visualization tools enable access to Big Data. They enable analysts to take note of the variations and refine their data and logic models accordingly. This is an extension of the same principles that financial services companies have been using for over a decade to promote greater use of credit cards by finding patterns in customers’ spending habits that are used to drive marketing campaigns. Changes in the patterns indicate opportunities; to understand the opportunities and communicate the benefits they can provide, you need to first understand the data and the patterns.   In conclusion, for all its Volume, Variability, Variety, and Velocity, we discovered that leveraging Big Data requires good, old-fashioned data management best practices. Before you can effectively use data as a resource, you must understand it. Data Profiling is not a new concept, but it is critical when it comes to Big Data, because it cannot be done manually. Automated data profiling feeds information to Big Data analysts that improve their maps and working models of the data.  Other Big Data tools continuously scrutinize incoming data streams, refining the models and furthering understanding of the content. As understanding grows, data management standards for Big Data can be developed and applied to ensure that Big Data is intelligently integrated into traditional data repositories at an enterprise level while simultaneously leveraging its power in real time.      
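As a rough illustration of detecting variations while a data stream is in motion, the sketch below keeps a rolling window of recent values and flags readings that drift well away from the learned pattern. It is a simplified stand-in for the pattern mining described above, with made-up data and thresholds.

```python
# A minimal sketch of pattern-variation detection on a data stream.
from collections import deque
from statistics import mean, stdev

def watch_stream(values, window=20, threshold=3.0):
    """Yield values that deviate sharply from the recent pattern."""
    recent = deque(maxlen=window)
    for value in values:
        if len(recent) >= 5:                        # need some history first
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield value, mu                     # a variation worth acting on
        recent.append(value)

# Hypothetical stream: steady readings with one sharp departure from the pattern.
stream = [100, 101, 99, 102, 100, 98, 101, 100, 250, 99, 100]
for value, baseline in watch_stream(stream):
    print(f"Variation detected: {value} vs. recent baseline {baseline:.1f}")
```

In a real pipeline the "predefined action" triggered by such a variation might be a marketing offer, a fraud review, or simply a refinement of the working model.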
Check out past Conversations on Big Data for more tips and insights on using big data to improve your organization’s mission effectiveness. [1] Wikipedia [2] Snijders, C., Matzat, U., & Reips, U.-D. (2012). ‘Big Data’: Big gaps of knowledge in the field of Internet. International Journal of Internet Science, 7, 1-5. http://www.ijis.net/ijis7_1/ijis7_1_editorial.html [3] Ibrahim Abaker Targio Hashem, Ibrar Yaqoob, Nor Badrul Anuar, Salimah Mokhtar, Abdullah Gani, Samee Ullah Khan, The rise of “big data” on cloud computing: Review and open research issues, Information Systems, Volume 47, January 2015, Pages 98-115, ISSN 0306-4379, http://dx.doi.org/10.1016/j.is.2014.07.006 ___ This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation. Brian Wed, 04 Feb 2015 00:44:00 GMT f1397696-738c-4295-afcd-943feb885714:13913 http://brianmurrow.com/Blog/tabid/123/ID/13912/Is-Analytics-Utopia-just-around-the-bend.aspx#Comments 0 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/RssComments.aspx?TabID=123&ModuleID=637&ArticleID=13912 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/Tracking/Trackback.aspx?ArticleID=13912&PortalID=0&TabID=123 Is Analytics Utopia just around the bend? http://brianmurrow.com/Blog/tabid/123/ID/13912/Is-Analytics-Utopia-just-around-the-bend.aspx As our Conversations on Big Data is a series of discussions about using analytics in creative and interesting ways, Lara Shane of the Partnership for Public Service and I speak to people who are already doing just that. Not surprisingly, these people all share a common vision of an end state where the power that quantitative analytics brings to decision making is leveraged daily through processes of discovering and interpreting complex patterns of everyday life and then capturing that knowledge. If you have ever wondered what that vision, an Analytics Utopia, is like, Carter Hewgley, Director of Enterprise Analytics at FEMA described it for us recently. Carter says “in an ideal world, everybody is capable of doing analysis. Quantitative analysis is a professional thing that you should not treat as another duty.” All citizens would understand the concepts behind analytics, which fosters an environment where “data-driven decision making” is the norm, rather than a capability starting to proliferate in innovative organizations. Even those who do not enjoy doing analytics would at least have a comfort level that helps built trust, which Carter says “is absolutely the thing that you have to establish with the people that you’re working with.” Data would be readily available through a totally invisible “behind the scenes connectivity to really powerful information” that is generated and collected through standard processes that are well defined, auditable, and timely. That data is integrated in a secure environment that leverages common standards and definitions. Unstructured data is mined for patterns that can be analyzed and ultimately mapped to those standards, which includes the veracity of each data source. Supplemental data that provides pertinent dimensions and history is robust with sufficient levels of aggregation to enable initial analyses by everyday users. 
Identity management tools would actively monitor, protect and enable access to data by defining common user groups and roles that apply privacy or security standards. Professional analysts use standard tools to build analytical models, acquire the data needed to fully test their models and the hypotheses they represent, run the models, and then interpret the outcomes. They use versatile and simple tools to present their findings at various levels of detail, using media suited to differing audiences. They engage with business SMEs and process owners across wide areas of function in joint “stat sessions” that are conducted in complete confidence and trust that their purpose is continuous improvement. Outcomes are transformed into knowledge that is retained and actively managed. There are no wrong answers; the outcomes of analysis contribute to learning—even if they are unexpected. Leaders invest a great deal of their own time in analytics and are expected to provide a common vision across their organization. Executives are expected to frequently review their strategic initiatives and needs against the current knowledge base and analytic programs to ensure that relevant data exists to make any possible foreseeable data-driven decision. They are also expected to lead more strategic efforts that identify knowledge gaps so that the data needed is acquired and integrated well in time to answer the questions of the medium to long term future. Business managers include support for analytics as standard budget items. They leverage past successes using analytics to justify further programs to widen or build on current knowledge in logical, incremental steps. They know when to “let go of solving the whole problem in favor of solving one particular part of the problem”, and when digging deeper is justified. Process owners insists on simplification, standardization and re-use of shared process so that process performance can be readily measured, evaluated, and the process tweaked if needed. Does this sound like your idea of Analytics Utopia? Then you may be interested in knowing Carter Hewgley’s top three pieces of advice for building it from scratch: 1.       Find a Champion: Carter says the first step is to find someone “at a leadership level in your organization that does care about data. Find them and make sure that they are championing your cause and that you can leverage their belief in this to further it.” 2.       Assess the Culture: You have to know what your starting point is. According to Carter “none of us has the luxury of pretending … if they’re not into it, then you need to have a different strategy than if they are on board.” 3.       Aim for a Quick Win: “Pick a problem that they care about that you have data … and show them really quickly that, hey, if you did this differently you could” get a better outcome. The value of that outcome can then be associated with the analytics and justify the next step. Check out past Conversations on Big Data for more tips on how to build your own Analytics Utopia. ___ This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation. 
Brian Tue, 06 Jan 2015 00:32:00 GMT f1397696-738c-4295-afcd-943feb885714:13912 http://brianmurrow.com/Blog/tabid/123/ID/13911/Everything-you-need-to-know-to-build-a-successful-analytics-program-you-learned-in-kindergarten.aspx#Comments 0 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/RssComments.aspx?TabID=123&ModuleID=637&ArticleID=13911 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/Tracking/Trackback.aspx?ArticleID=13911&PortalID=0&TabID=123 Everything you need to know to build a successful analytics program you learned in kindergarten http://brianmurrow.com/Blog/tabid/123/ID/13911/Everything-you-need-to-know-to-build-a-successful-analytics-program-you-learned-in-kindergarten.aspx For several years, the Partnership for Public Service and the IBM Center for The Business of Government have partnered to have Conversations on Big Data, a series of discussions designed to broaden the perspective about quantitative analytics and share lessons learned about what worked well and what did not. As I was reviewing the transcripts of some of our “Conversations” to prepare to write this blog, it occurred to me that although the data analytics techniques are often complex, some of the lessons learned that really makes an analytics program successful are rather straightforward.   As an example, let’s look at the Conversation Lara Shane and I had recently with Malcolm Bertoni, the Assistant Commissioner for Planning at the FDA. While analytics has always been used at the Agency in the context of drug trials, the Prescription Drug User Fee Act of 1992 (PDUFA) mandated a partnership between FDA and drug companies in order to fund the work required to improve the agency’s performance evaluation process. The results were successful and PDUFA has been renewed several times and performance management analytics has expanded over the last 14 years throughout the Agency. Here are some of the ‘tips’ Malcolm shared that are all things we learned in kindergarten. You must share with others. While professional data analysts are necessary to build analytic logic models and algorithms, they are not intimately familiar with the data to which those algorithms are applied. The analysts have to turn to the people who use the data on a daily basis. Malcolm pointed out that “people do feel some ownership of their data ... It is kind of human nature that they’re responsible and accountable for their program, and they don’t want people getting into their business who don’t understand it.” When asked how he helps folks get over that fear of sharing, Malcolm said that building good relationships helps, but that there also need to be a “top-down authority” figure to point to. The authority figure (or teacher) has the power to reward the subject matter expert for sharing and to punish those who do not share. You must learn to work with others. An important component of most analytics programs is that the data involved is created or hosted, or both, in multiple business units across an organization. 
Malcolm and his team had to look at “processes or issues that cut across the organizational lines, so there wasn’t any one owner.” They had to “bring together all the people from the different programs … and start measuring it and having the quarterly discussions about it.” These discussions were also used to disseminate “best practices and different techniques for recruiting and managing the process and the cycling that goes in and out.” Malcolm and his team had to lead the participants from across the FDA in the process of sharing ideas as well as data and working together to find the best solution. Creativity is fun and important. Malcolm says “we want to be innovative. We want to look at the new technologies and opportunities and be creative and at the same time be grounded in what’s practical. And if we can kind of get the best of both worlds, I think there’s some really exciting opportunities in this field. Easy problems get solved. It’s the complex problems, particularly in this day and age, that tend to require … different thinking perspectives.” Designing and building successful quantitative analytics programs often demands “out of the box” thinking. There is no such thing as a “bad” idea, and all options should be considered – none rejected without review. Do not stop asking why. Most of us have had those conversations with children where every response generates another instance of the question “Why?” Children are always questioning because they have not yet learned to be embarrassed by not knowing the answer. Of course, this could be why children learn so much at an early age! This reluctance to admit ignorance must be conquered for an analytics program to succeed. The entire point of quantitative analytics is to find the answers to a specific question or sets of questions. People may think they know the answers, and sometimes the “gut feeling” does represent knowledge, but those gut-feeling answers must be backed up by findings and results that tie back to metrics and numbers if they are to be accepted by an organization. Another key area where it is critical to question is data quality. Malcolm compares his analytics team to a “quality assurance function that independently looks at data. But I think the most important thing to improve quality is to use the data … not just reporting it, but actually having conversations with the managers and asking questions, gee, I noticed there seems to be something jumping up here; why is that? A lot of times, they’ll know the answer, and other times, they’ll say, thanks for asking; we went and looked into it, and we found an anomaly, or there was a problem with the data. So engaging in that dialogue – and you know, that’s really the way to assure good quality data.” Connect the Dots. Many of us will remember doing exercises as children where we had to “connect the dots.” Malcolm warns that it’s not enough to know the beginning questions and the end results of an analytics program. There are lots of “dots,” such as processes, integration points, etc., that have to be thoroughly understood or the integrity of the results may be questionable. He said “I’m looking for these intermediate impacts that still have a pretty direct line of causation.” Each one of these “intermediate” impacts must be identified, defined, and assessed, and then connected to the next to understand the overall picture. Keep on Learning. 
In the end, Malcolm says that his mission at the FDA is “really about being a learning organization and a very intentional goal-seeking organization that’s trying to do the best for the public health with the resources and tools that you have.” The success of quantitative analytics at the FDA was the result of bringing together a group of people with varying skills and knowledge to share resources, work toward a common goal by being creative, ask questions, and discover the dots and connect them in order to learn as much as they can and, hopefully, enjoy the experience enough to want to tackle the next set of questions. For the complete Conversations on Big Data blog series, please click here. ___ This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation. Brian Tue, 16 Dec 2014 00:05:00 GMT Leveraging quantitative analytics in the legal profession http://brianmurrow.com/Blog/tabid/123/ID/13221/Leveraging-quantitative-analytics-in-the-legal-profession.aspx The longest-running television crime drama was Law and Order, which hit the airwaves for twenty consecutive seasons. I would argue that the effectiveness of the series had a lot to do with the balance of airtime between the “law” and the “order” sides of the criminal justice system. Therefore, as a continuation of my blog series, Conversations on Big Data, I am picking up on a theme that was started in prior blogs where I discussed how real-life police – the “Order” half of the legal system – use quantitative analytics as key evidence that convinces the jury to send the crooks away for a long, long time. Today’s blog addresses the other half of the system – the “Law” – expanding on a conversation that I had with Gerald Ray, the Deputy Executive Director of the Office of Appellate Operations at the Social Security Administration. Gerald’s interest in applying analytics to the law started when he was in law school. He explains to us that his professors “would have us read cases and then extract some maxims, … as if there was logic to it all.” He went on to say, “but what I also saw is … divergent opinions from different judges.” A sitting judge has demonstrated expertise as a result of years of training and experience. Yet different judges at similar levels of the judicial system, with very similar training and experience, can and do render different outcomes. Gerald discovered that sometimes different outcomes happen because of “process errors.” He concluded that “if we could be a little more systematic, we could … give better feedback and change the behavior.” Acting on this hypothesis, Gerald spearheaded a program of continuous improvement at SSA, applied specifically to policy compliance. This improved the team’s performance metrics, as its process improvements enabled the team to do more work. 
I provide more details on this in one of my prior blog posts, “Applying Analytics to Continuous Process Improvement.” This demonstrated program success paved the way for the expansion of analytics as a legal tool beyond the process of policy compliance. Gerald explored techniques such as natural language processing (NLP), text-mining, and predictive analytics to leverage the SSA’s other non-structured data types. As a result, Gerald initiated a program that applies K-means clustering techniques to mine claims data and see if there is any relationship in process errors. This early initiative has already given Gerald enough evidence to cite a correlation with the expression of “homicidal ideation” (remember, this was a conversation with a lawyer) and cases associated with drug or alcohol problems. Gerald says, this “small example” is important, because the legal issues generated by the threatening behavior might be prevented. “We’d rather not push them down that path because they really can’t control necessarily what they’re saying.” Gerald says that there are 3 important components needed to leverage quantitative analytics in the legal profession. 1.       Problem 2.       Data scientists and subject matter experts 3.       Solution The solution, “which is often in the data”, typically requires change in process and behavior. Gerald says that “it’s easier for me to find the problem than to get people to change their behavior.” Successful programs are able to work with staff so that the numbers help to make the “solution” easier to understand and more intuitive. For example, visualization tools are incredibly useful to demonstrate “that it’s better to do it this way than that way”, Gerald says. Furthermore, he uses heat maps to publish policy compliance monitoring reports, noting that “when people see rows and columns and numbers, their eyes glaze over unless they’re accountants.” “But if I can show patterns from the visualizations … if something is different and it jumps out in the visualizations – using, color and things to make it pop – then people can see that instantly.” This results in a higher degree of buy-in. Sounds like a session in the situation room of whatever your favorite legal drama show, doesn’t it? In my prior blog on my conversation with Gerald, he talked about the importance of feedback for “Applying Analytics to Continuous Process Improvement.” And by sharing that feedback with stakeholders, through visualizations, one can dramatically impact change by encouraging conforming to an optimal process or outcome. Another influence on Gerald is historian James Burke, who teaches that all events are ultimately interconnected. For example, the steam engine was invented in the mining industry and its success there triggered the Industrial Revolution. Gerald feels that using data analytics and mathematical techniques and tools in the legal system is another example of a Burke-like interconnection, which we see in the growing use of video, audio, and image data in criminal investigations and prosecutions. For the complete audio interview, please visit: http://ourpublicservice.org/OPS/events/bigdata/ ___ This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation. 
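For readers curious what the K-means clustering approach Gerald described might look like in practice, here is a toy-scale sketch using scikit-learn: it vectorizes a handful of made-up case notes with TF-IDF and groups them into clusters. It illustrates the technique only and is not SSA's actual pipeline or data.

```python
# A toy sketch of clustering free-text case notes with TF-IDF + K-means.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Entirely fabricated, non-sensitive example notes.
notes = [
    "claimant reports chronic back pain after workplace injury",
    "back injury noted, ongoing pain management",
    "history of alcohol dependence and missed hearings",
    "claimant struggling with alcohol use, missed two appointments",
    "paperwork filed late due to address change",
    "address updated, late filing of appeal paperwork",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(notes)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for cluster, note in sorted(zip(labels, notes)):
    print(cluster, note)
```

Once notes group together, an analyst can ask whether a given cluster also shows a higher rate of process errors, which is the kind of correlation Gerald describes finding.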
Brian Fri, 14 Nov 2014 22:08:00 GMT f1397696-738c-4295-afcd-943feb885714:13221 http://brianmurrow.com/Blog/tabid/123/ID/10622/Applying-Analytics-to-Improve-Organization-Operational-Performance--Insights-on-from-OMB.aspx#Comments 0 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/RssComments.aspx?TabID=123&ModuleID=637&ArticleID=10622 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/Tracking/Trackback.aspx?ArticleID=10622&PortalID=0&TabID=123 Applying Analytics to Improve Organization Operational Performance -- Insights on from OMB http://brianmurrow.com/Blog/tabid/123/ID/10622/Applying-Analytics-to-Improve-Organization-Operational-Performance--Insights-on-from-OMB.aspx As a continuation of my podcast series, Conversations on Big Data, I sat down with Lisa Danzig, Associate Director for Personnel and Performance, at the Office of Management and Budget (OMB), to discuss her experience using analytics at HUD and OMB to increase operational performance. These podcast conversations are designed to broaden the perspective about quantitative analytics, share lessons learned about what worked well and what did not, and provide specific examples of benefits gained. Prior to joining OMB, Lisa was Director of Strategic Planning Management at HUD, where she was instrumental in leveraging program performance data through the HUDStat program. I discussed Lisa’s experience at HUD in an earlier blog. Today, I’d like to share with you Lisa’s advice on implementing analytics programs, which are surely going to guide her plans at OMB. Organization Culture. Lisa says “so I think the first thing I learned is that this is really hard. I think one of the reasons that I was attractive for the role at OMB … is because I’ve had the experience … implementing it versus talking about it.” During our conversation, Lisa shared many of the challenges involved in implementing analytics – many of which are cultural – and strategies to help move the culture toward embracing analytics. One of those challenges, for example, is to get people to realize that “failure is OK” in the context of continuous learning and improvement. A strategy that Lisa used at HUD to address that cultural shift is promoting a strategy of achieving a “good” – but not necessarily “perfect” result. Data Quality. Another important tip Lisa shares is “that it’s important to start with the data you have.” “Starting the conversation starts to shed light on the information that you have and improve it inherently just by shining a spotlight on it.” She states quite clearly that, while data quality and data lag are big challenges, the most effective solution is to roll up your sleeves and do what it takes to address these data issues. “It takes some muscle, and it’s a little tedious, but I think you can make it happen.” She advises that it will take time to figure out what the data issues are, implement a resolution, and evaluate results, but it will ultimately pay off. Collaboration. Collaboration is another critical element for building a successful analytics program. Lisa emphasizes the “importance of building trust so that people feel they are invited to a collaboration about problem-solving and not like they’re on the hot seat and need to be defensive.” She describes a “fine line” between accountability as far as meeting goals and trust that elicits candid discussions of challenges and shortcomings. Without the trust element, people are less forthcoming and less willing to be creative. 
There are also situations where collaboration with a partner is required. These partners include: •             Intra or inter-agency partners to help provide data or to accomplish a specific business goal •             Industry partners that can be leveraged for support regarding best practices •             Technical partners that can help address technology gaps Business Value. Analytics provide more value when there is focus on a clearly articulated business problem. To illustrate this, Lisa told me a joke about “the man who lost his keys. He was looking for them on the ground, and someone came along and said, did you lose your keys? And he said yes. So she said, well did you lose them here? And the man said, no, I lost them over there, but this is where the light is.” She has found that organizations tend to collect data in large repositories without any particular reason why or standards about what the data applies to. “So I think starting with what’s your problem or goal … is fundamental to making this relevant and useful.” Even when it comes to Big Data, Lisa is a fan of keeping it simple, even as she recognizes the powerful potential of what can be accomplished by leveraging Big Data. Celebrate Success. Last but not least, Lisa says, is “the importance of celebrating success.” Analytics programs can be so large, complex, and time-consuming that the sheer amount of work to be done can be overwhelming. By just “finding a couple of examples where people are modeling … behaviors, using data to make decisions, and it doesn’t feel so overwhelming”. Success should be clearly defined and strategy needs to ensure that each iteration of work provides increasing increments of success. To hear the full interview with Lisa or our interviews with other leaders in big data analytics, please visit Conversations on Big Data. ___ This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation. Brian Mon, 03 Nov 2014 18:56:00 GMT f1397696-738c-4295-afcd-943feb885714:10622 http://brianmurrow.com/Blog/tabid/123/ID/10621/Three-Key-Ingredients-to-Build-an-Investigative-Analytics-Unit-Lessons-from-the-RAT-Board.aspx#Comments 0 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/RssComments.aspx?TabID=123&ModuleID=637&ArticleID=10621 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/Tracking/Trackback.aspx?ArticleID=10621&PortalID=0&TabID=123 Three Key Ingredients to Build an Investigative Analytics Unit – Lessons from the RAT Board http://brianmurrow.com/Blog/tabid/123/ID/10621/Three-Key-Ingredients-to-Build-an-Investigative-Analytics-Unit-Lessons-from-the-RAT-Board.aspx As a data scientist, I of course love the police dramas that use analytics to derive key evidence that put the crooks in jail for a long, long time. Well, a few weeks ago, as a part of our ongoing series, Conversations on Big Data, I had a brief window into just such an operation when I sat down with Steve Beltz, Assistant Director of the Recovery Operations Center of the Recovery, Accountability, and Transparency Board (fondly known as RAT). 
The RAT Board was originally created by the American Recovery and Reinvestment Act of 2009 (ARRA) to provide transparency of ARRA-related funds and detect and prevent fraud, waste, and mismanagement of those funds. Later the RAT Board’s authority was expanded to include oversight of all federal funding. Steve has been working in the public sector for over three decades, mostly in law enforcement as a “detective and reconstructionist”. Steve told us that he “was the first person in the Washington State Patrol that started the computer forensic unit.” “I would get a 100 MB disk.” “DOS was my tool … we’d walk sector by sector through a disk and analyze it.” While technology has come a long way, Steve says “you can have the best data in the world and the best tools in the world, but if you don’t have the right person that knows how to ask the right questions, you’ll get nothing.” He goes on to say that “somebody has to know how to understand what the answer is, and then dig deeper”. It takes a special type of skill to look beyond the numbers and the technology and take that extra step to dig deeper into the evidence. Steve shared a story with us that illustrated his detection-through-analytics approach. He was teaching a class at the Federal Law Enforcement Training Center when one of the agents in his class reminded him that they had met in the field. The agent recounted that “I was with the prosecutor … interviewing the defendant, and the defense attorney … sitting across the table, the typical thing. They let the defendant talk and explain … and [he] was way off in right field, with his conversation and not even close to what … we’ve already dug up. So it finally got to the point where the prosecutor took out a chart that we had made, the link chart of all his relationships and who he’s working with, and where the money is going, and kind of slid it across the table and said, well you’ve said this, but this is what we found out. And they took an immediate recess for about 10 minutes and came back and decided … to plead guilty.” In describing the most important ingredients for a successful analytics program, Steve used the analogy of a three-legged stool. He feels that it is necessary to have: 1.       Good Data 2.       Good Tools 3.       Good Analysts Steve says these are all critical components without which “you’re going to sell yourself very short.” He goes on to say that “acquiring good data is a common problem that will never go away so long as shared data is in use.” Steve’s experience has been that most agencies struggle with obtaining good tools, and even more so with finding good analysts. On the other hand, “tools are getting very good, and conversely so, very complicated. The analysis process itself … is a full-time job and a lot of agencies are having … people wear multiple hats. So you’ve got an agent that’s not only out investigating but also … trying to come back and do the analysis as well.” While the RAT Board will be winding down in the near future, Steve acknowledges that the results of its work are still pending. The Recovery Ops Center is currently supporting over 200 cases of alleged mismanagement of funds annually. Steve’s analysts have yet to testify in court, but he is training them for that event. In Steve’s view, “That’s not really what we’re there for.” He feels that his job is to “build intelligence packets.” He is then much happier if his intelligence or “prosecution packets” result in a guilty plea rather than needing to be used in a trial. 
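The “link chart” in the story above is essentially a graph of entities and money flows. As a hypothetical sketch of the idea, the snippet below uses the networkx library to connect people, companies, and payments from made-up records and then lists who is connected to a subject of interest; it is illustrative only and not the Recovery Operations Center's tooling.

```python
# A toy link chart: entities as nodes, payments and relationships as directed edges.
import networkx as nx

G = nx.DiGraph()
# Fabricated records: (payer, payee, dollar amount).
payments = [
    ("Subject LLC", "Shell Co A", 250_000),
    ("Shell Co A", "J. Doe", 240_000),
    ("Grant Agency", "Subject LLC", 300_000),
    ("Subject LLC", "Vendor B", 40_000),
]
for payer, payee, amount in payments:
    G.add_edge(payer, payee, amount=amount)

subject = "Subject LLC"
print("Money flowing out of", subject)
for _, payee, data in G.out_edges(subject, data=True):
    print(f"  -> {payee}: ${data['amount']:,}")

print("Everything reachable downstream:", nx.descendants(G, subject))
```

Drawn as a picture rather than printed, the same graph becomes exactly the kind of relationship chart an analyst can slide across a table.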
To learn more about Steve’s approach, listen to my complete interview with Steve at http://www.ourpublicservice.org/bigdata/. And keep an eye out for more about the RAT Board in future blogs. ___ This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation. Brian Mon, 27 Oct 2014 17:07:00 GMT f1397696-738c-4295-afcd-943feb885714:10621 http://brianmurrow.com/Blog/tabid/123/ID/10456/Using-Big-Data-Analytics-for-Effective-Financial-Market-Oversight--The-Three-Essential-Ingredients.aspx#Comments 0 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/RssComments.aspx?TabID=123&ModuleID=637&ArticleID=10456 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/Tracking/Trackback.aspx?ArticleID=10456&PortalID=0&TabID=123 Using Big Data Analytics for Effective Financial Market Oversight -- The Three Essential Ingredients. http://brianmurrow.com/Blog/tabid/123/ID/10456/Using-Big-Data-Analytics-for-Effective-Financial-Market-Oversight--The-Three-Essential-Ingredients.aspx Over the past few months, I have co-hosted Conversations on Big Data, a series of discussions about using analytics in creative and interesting ways. Today’s Conversation is with Lori Walsh, the Chief of the Center for Risk and Quantitative Analytics (the Center) for the Securities Exchange Commission’s (SEC) Division of Enforcement. The SEC has several analytics programs that are structured in a “hub and spoke system.” Lori’s Center sits at the hub to “centralize the information and determine how to share those techniques and tools generally”. Lori says that the “main part of my job entails proactive identification of … violations of the securities laws. And so I focus on data, analytical tools, and techniques to help identify violations … more quickly”. Previously, Lori ran the Office of Market Intelligence for the Division of Enforcement, and this background prepared her well for her role with the Center. She cited the management of the tips and complaints process as an example of this preparation. SEC receives approximately 20,000 tips annually that are documented, profiled, and then evaluated against several criteria such as credibility, significance, and risk. Lori told me “Seeing all of these tips come in day after day … made me see a pattern.” From these patterns, Lori says she that she now tries to “go into the data and identify things before we get a tip.” “That’s been really critical in how we’ve structured the Center“. She says her method is to use data mining techniques to identify patterns in the data and correlate them to “violative” activity. Lori shared her three essential elements for a good analytics program. ·         Data ·         Infrastructure ·         Subject Matter Expertise Data is not an issue for the SEC, which consumes a huge volume of data that is further processed, analyzed, and enhanced by regional offices sitting at the end of the various analytic “spokes” referred to above. However, Lori is very excited about technical advances in data integration. All that data, raw and derived, that SEC consumes is collected at the Center. She does not worry about data quality because she actively manages to it. 
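As an aside on the tips-and-complaints triage Lori described above, a scoring step can be sketched as nothing more than a weighted combination of the evaluation criteria. The weights, fields, and cutoff below are invented for illustration and are not the SEC's actual methodology.

```python
# A hypothetical sketch of triaging incoming tips against weighted criteria.
from dataclasses import dataclass

@dataclass
class Tip:
    tip_id: str
    credibility: float   # 0..1, assessed by an analyst
    significance: float  # 0..1, e.g. potential investor harm
    risk: float          # 0..1, e.g. match against known risk patterns

WEIGHTS = {"credibility": 0.4, "significance": 0.3, "risk": 0.3}  # assumed weights
REVIEW_CUTOFF = 0.6                                               # assumed threshold

def score(tip: Tip) -> float:
    return (WEIGHTS["credibility"] * tip.credibility
            + WEIGHTS["significance"] * tip.significance
            + WEIGHTS["risk"] * tip.risk)

tips = [Tip("T-001", 0.9, 0.7, 0.8), Tip("T-002", 0.3, 0.2, 0.4)]
for tip in sorted(tips, key=score, reverse=True):
    flag = "escalate" if score(tip) >= REVIEW_CUTOFF else "queue"
    print(tip.tip_id, round(score(tip), 2), flag)
```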
“A lot of people think we don’t have enough data available.” “I say we’ve got way too much data available to us.” “We’ve got to figure out what data is needed to answer a question.” She goes on to say that “being tripped up by poor-quality data is a slightly different issue.” “You want to get the data as clean as possible.” But then you have to “caveat the output … based on the limitations of the data”. As for Infrastructure, Lori describes her current integration process as “somewhat laborious and cumbersome”. She says “it’s a way of pulling together pieces of the puzzle … and you are able to see connections among the data putting pieces of the puzzle together.” She is starting to use tools that will not only automate and facilitate the actual integration process, but also “map it for you … using icons or histograms or a timeline so that you can see the data in lots of different ways.” She says that “data visualization is fairly new for us [and] is exciting.” Lori says that she relies heavily on Subject Matter Experts to supply her with questions to apply to her large repository of data. The Division of Enforcement staff, including attorneys, accountants, and investigators, is well trained and experienced. Lori says “they know what fraud looks like, but they don’t necessarily know how to take that information out of their head and put it into an algorithm or data or analytics.” “We try to get the information out of the experts’ heads, identify patterns, identify data that we can use to apply the patterns to, and then filter the universe of potential behavior to the ones that are most likely to be high-risk.” Lori says she was taught, as an empiricist, that the first thing you need is a theory to test. I asked Lori to tell us her definition of success. She says the “ultimate goal is for the Center to be more efficient, faster at identifying ‘violative’ activity. And if we identify something before we get a tip in, that’s a success.” She cites an example that occurred recently where the Center used a risk-based analytics process to identify a potentially fraudulent offering. It was referred to an investigative group within Enforcement. Two days later a tip came in on the very same offering. “That’s a success for us.” Brian Mon, 20 Oct 2014 19:11:00 GMT Analytics Need to Provide Business Value – a Conversation with the IRS on Big Data. http://brianmurrow.com/Blog/tabid/123/ID/9914/Analytics-Need-to-Provide-Business-Value-a-Conversation-with-the-IRS-on-Big-Data.aspx As a part of my ongoing conversation with senior executive leaders, Conversations on Big Data, I recently sat down with Dean Silverman, Senior Advisor to the Commissioner, Office of Compliance Analytics, at the Internal Revenue Service (IRS). These Big Data conversations tend to focus on best practices in using big data and analytics in creative and interesting ways to support real-world business goals. Dean joined the IRS in 2011 to build an advanced analytics program. According to Dean, for analytics to be relevant, analytics programs have to provide practical outcomes and measurable business value. 
“There’s an endless amount of data…but what good does it do for me [if it does not lead to] any meaningful, actionable strategy?”  Furthermore, he believes that “analysis is important … but if it’s not overlaid with problem-solving, you’re going to spin around and answer the wrong question”. Since the capabilities needed for analytics often require considerable investment, a strategic goal associated with specific outcomes provides a framework to measure that future return on the required investment. In developing the IRS’s Office of Compliance Analytics (OCA), he describes the team as: 1.       Empowered to ask the hard, sometimes uncomfortable questions 2.       Helps structure and participate in major change programs 3.       Support the core business in framing the problem 4.       Uses data that exists to find a solution Dean feels strongly that an analytics program depends on executive sponsorship, to provide the above “secret sauce”. Those champions need to engage key stakeholders to identify the problems that are currently keeping the organization from accomplishing its goals now, and promote the inclusion of “analytics as a way of doing business”. Specifically, this approach helped to frame Dean’s primary objectives for analytics: 1.       Reduce fraud and improper payments, primarily through Identity Theft and the Earned Income Tax Credit (EITC) 2.       Reduce the tax gap 3.       Identify core compliance enhancements Dean and his team started by reaching out to the operating and IT divisions to “partner hand and glove … in agreeing on what the problem is, solving it, [and] …then measuring the results.” He considers this the “embedded partner approach”. To be successful, an outsider who has been “embedded” in the organization by a senior leader needs to create a sense of mutual trust. It is critical to convince the internal partners that you are there to help focus on their results by leveraging their expertise and understanding of current process and available data. In our interview, Dean provides two examples on how the embedded partnerships within an organization work successfully. To learn more about Dean’s successes, listen to my complete interview with Dean at http://www.ourpublicservice.org/bigdata/. Brian Mon, 06 Oct 2014 23:46:00 GMT f1397696-738c-4295-afcd-943feb885714:9914 http://brianmurrow.com/Blog/tabid/123/ID/9813/An-Iterative-Approach-to-Maturing-Strategic-Analytics--FEMAStat.aspx#Comments 0 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/RssComments.aspx?TabID=123&ModuleID=637&ArticleID=9813 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/Tracking/Trackback.aspx?ArticleID=9813&PortalID=0&TabID=123 An Iterative Approach to Maturing Strategic Analytics - FEMAStat http://brianmurrow.com/Blog/tabid/123/ID/9813/An-Iterative-Approach-to-Maturing-Strategic-Analytics--FEMAStat.aspx Last month, I sat down with Carter Hewgley, Director of Enterprise Analytics at the Federal Emergency Management Agency (FEMA), to discuss using big data and analytics in a number of creative and interesting ways to support real-world business goals. This conversation was a part of the ongoing IBM/Partnership for Public Service Podcast series, Conversations on Big Data. Before arriving at FEMA, Carter had broad experience bringing analytics to various agencies in the District of Columbia municipal government, and before that, he was a financial analyst.  
He says that he has always “had sort of a data focus… not because I’m great at it…but I just believe in facts.”  When Carter joined FEMA in 2011, he realized that the organization was focused on timely delivery of services and the processes required collect and organize all the resources to support those services.  Although there were “analytical cells” across the agency and programs, enterprise-level analytical capability was immature. FEMA was a “disaster-driven” organization, more focused on getting in better next time than looking at how it did last time. It took some executive sponsorship from a senior executive who had been impressed by Boston’s 911 response analytics, to promote acceptance of performance measures, as value-add activities that provide genuine insight. Carter says that Building FEMAStat was a “three-year, iterative, ongoing, learning experience”.  Year 1, Get Started:  “Grab whatever data you can find, throw it up on the wall, and match it to the right decision at the right time to sort of make it the most relevant and important sort of value-added experience for those who are using it.” Carter started out by reaching out across FEMA to discover how program and department managers were currently managing performance reporting. He encouraged collaboration with the people who understand the data, to document and publish it. Simultaneously, he initiated discussions with senior management to identify key strategic focus areas or business questions. The important task is to continue to document, tailor, and refine available data so that you can figure out what it tells you from a “bottom up” perspective and then relate that data to the strategic focus areas for the “top down” perspective. Then you can start to establish benchmarks that provide context. In retrospect, Carter wishes he had also used that time to build more of an integration infrastructure, but also feels that data quality is not relevant – at first. Cleansing the data can be applied once you know what you’ve got. Another lesson learned in Year 1 is that when initiatives compete with disasters, disasters win. Carter happened to be in New York working on data collection and analysis when Hurricane Sandy struck.  Working with local FEMA officials, he realized there was a need for analytics on site, where rescue and recovery activities were being directed. Having to deploy resources remotely from FEMA HQ would not have been optimal. As a result, Carter defined the concept of mobile field analytics as a focus for Year 2. Year 2, Focus on Strategic Initiatives:  “The second year was more collaborative [and included measuring] ... how we are all performing on some … strategic initiatives.”  Carter says that a decision was made to focus attention on fewer strategic initiatives, in order to more sharply define the initiatives’ value-add. There were “stat sessions” that examined common processes. When staff stopped to define their processes, they were amazed at the complexity and readily came together to model streamlined, more efficient processes to measure. Not surprisingly, a big challenge was letting go of the need to solve “whole problem in favor of solving a particular part of the problem”. Year 2 was also a time for beginning to build out of the field office capability. This was accompanied by a growing sense that more “flavors” of metrics were needed. 
Carter realized that, in addition to field performance statistics, other areas such as resource allocation, finance and budget, and capability building should be proposed. Year 3, Simplicity and Sustainability:  “ . . . what we’ve tried to do is focus in on one specific problem and really pin it down . . . I think when you narrow it to the right question that is high priority enough and strategic enough that senior leaders really care about it but it’s specific enough to where you’re not boiling the ocean, I think you really hit a sweet spot”. Year 3 was spent considering the strategic questions for further focus and further refining the data and capabilities to support the types of analytics FEMA needs. And as a result, a new team organized according to those categories. In conclusion, we asked Carter to name his key ingredients for analytics success: ·         Executive sponsorship ·         Leadership and analytics expertise ·         Data and people who understand the data ·         Communication, curiosity, and collaboration ·         Business value, for your executive sponsor to use for ROI and to justify investment Brian Mon, 29 Sep 2014 13:39:00 GMT f1397696-738c-4295-afcd-943feb885714:9813 http://brianmurrow.com/Blog/tabid/123/ID/10129/Big-Data-Analytics-Career-Opportunities.aspx#Comments 0 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/RssComments.aspx?TabID=123&ModuleID=637&ArticleID=10129 http://brianmurrow.com/DesktopModules/DnnForge%20-%20NewsArticles/Tracking/Trackback.aspx?ArticleID=10129&PortalID=0&TabID=123 Big Data Analytics Career Opportunities http://brianmurrow.com/Blog/tabid/123/ID/10129/Big-Data-Analytics-Career-Opportunities.aspx In this blog space, I typically discuss various leading edge examples of how public sector and financial services organizations are using Big Data Analytics to advance their organizations’ mission. Today, I am going to invite folks to be a part of the excitement. At IBM, my team is doing extensive hiring in all areas of Big Data Analytics and are looking to hire people of all levels of experience in the following areas: InfoSphere BigInsights Developer(Hadoop) Senior (ETL) Datastage Specialist Datastage (ETL) Specialist (MidLevel) Big Data Solution Architect Watson Explorer (Vivisimo) Developer Netezza/Pure Analytics/Greenplum Developer Senior Advanced Analytics (SAS/SPSS/R)  So if you or someone you know is looking to make a difference, challenge the status quo, and build expertise using leading edge Big Data Analytics tools and techniques, please click one of the links above, explore the opportunities, apply, and drop me a note. 
Brian Fri, 26 Sep 2014 14:21:00 GMT

Analytics-driven Organizational & Cultural Change http://brianmurrow.com/Blog/tabid/123/ID/9812/Analytics-driven-Organizational-Cultural-Change.aspx

As a continuation of Conversations on Big Data, my IBM/Partnership for Public Service podcast series about using big data and analytics in creative and interesting ways to support real-world business goals, we recently spoke with Malcolm Bertoni, Assistant Commissioner for Planning at the Food and Drug Administration, about the important elements of a successful analytics program and a road map for getting there.

By way of background, in the late 1980s and early 1990s the Food and Drug Administration faced a mountain of criticism. It was thought that the public health safety precautions built into its drug evaluation procedures, adopted in reaction to the Thalidomide tragedy two decades earlier, were responsible for delaying consumers' access to vital new drug therapies. Particularly in light of the growing activism around fighting AIDS, critics argued that the FDA's procedures were born out of disaster and therefore overly cautious. In response, Congress passed the Prescription Drug User Fee Act of 1992 (PDUFA), which enables the FDA to charge user fees to drug companies in order to fund the work required to improve the performance of the agency's evaluation process. The legislation also required the FDA to report regularly to Congress on its progress in meeting the goals. According to an article by Doug Schoen published 5/4/2012 on FORBES.com: "The PDUFA user fee program has been a great success over the past two decades, as the increased funding from the pharmaceutical companies has allowed the FDA to . . . reduce the drug review process by nearly half. A 2002 Government Accountability Office report found that the average approval time dropped from 27 months to 14 months over the first eight years of the act, and user fees increased new drug reviewers by 77 percent."

Malcolm Bertoni says that success came about because the FDA achieved "the whole culture of managing to particular goals that the managers really care about". Organizational and cultural changes are always challenges; here are some observations Malcolm shared with Laura Shane and me recently about how the FDA accomplished it.

At the beginning, there was some pushback when the agency engaged various program areas to collaborate on how to improve performance. Fortunately, there was very clear Executive Sponsorship: the law requires regular reports to Congress and the President, and those reports were scrutinized very closely by activists at the beginning. That need for transparency, along with the fiduciary aspect of the user fee structure, imposed a mutual responsibility on the FDA and the pharmaceutical companies that fostered collaboration from a business perspective, which resulted in the early establishment of Specific, Clearly Defined Goals. The FDA then fostered cross-agency Partnership by drawing on the similarities between drug and medical device trials and performance management.
Both sets of processes involve collecting huge volumes of data and evaluating that data against specified criteria. The definition of metrics, whether the metric is the number of successes within a drug trial cohort or the number of drugs approved per year, is a scientific process with which FDA staff are very familiar. In fact, there were many performance management processes already in place within the various agencies, and staff were somewhat protective of them. Malcolm says the FDA made sure to emphasize the real goal of instituting performance management, ensuring that the American public has the fastest possible access to advances in medical science and the treatment of disease, and to emphasize that it is not in any way punitive. The scientific methodologies already in place at the FDA helped internal stakeholders see the value of standardization and create sets of well-defined processes that both enable better delivery of the mission and support ever-improving performance management.

Once the stakeholders were engaged and some initial standards and processes defined, the work of defining Key Performance Indicators began. Malcolm says that a strategy for performance improvement needs both Top Down and Bottom Up contributors. The measures need to be both efficient and relevant: efficient measures are clearly supported by data that can be captured and analyzed relatively easily, and relevant measures are easily understood in the context of the process they are applied to. According to Malcolm, without the involvement of the process actors at the agency level, the importance of some measures can be overlooked. By discovering the relationships between the processes themselves and the performance measures, a set of KPIs is developed that are:
·         Clearly Defined
·         Aligned with Strategic Goals
·         Easy to Measure
·         Relevant to the Process
·         Attainable
(A minimal, hypothetical sketch of such a measure appears at the end of this post.)

According to Malcolm, once it became clear that the performance management initiative was more about defining goals than assigning blame, FDA staff "discovered the power of actually measuring and tracking something . . . all of a sudden, they get an 'A HA!' and they can see . . . the value of applying the same scientific principles used in application review or risk analysis to actually managing the program and the resources." That culture shift, from silos of ownership to collaboration across organizational lines, with real acceptance of performance goals as part of the public health mission, is the main enabler of the FDA's success in using analytics to measure and improve performance.

As proof of this success, Malcolm cites FDA-TRACK. "I invite people to go to the FDA website and search for FDA-TRACK . . . because it's kind of all of the above in the sense that it is a visible external portal into FDA performance . . . what FDA does and how we do it". Malcolm describes FDA-TRACK as a "regular conversation about performance" as well as a dashboard where one can drill down through performance data. Those conversations are needed to ensure that the dashboards are relevant to the areas they address as well as to the strategic goals of the organization.

Brian Mon, 15 Sep 2014 12:31:00 GMT
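To make the KPI properties above concrete, here is a minimal, hypothetical sketch of how a PDUFA-style review-time measure could be tracked. The 12-month target, the 90 percent goal, the field names, and the sample records are all assumptions for illustration; they are not the FDA's actual goal definitions or data.

```python
# Hypothetical illustration only: the target, goal percentage, and sample data
# are invented for this sketch and do not represent FDA's actual PDUFA goals.
from datetime import date

TARGET_MONTHS = 12   # assumed goal: complete each review within 12 months
GOAL_PERCENT = 90    # assumed goal: meet that target for 90% of reviews

reviews = [          # (application id, submitted, decision) -- fabricated records
    ("A-001", date(2013, 1, 15), date(2013, 11, 2)),
    ("A-002", date(2013, 2, 1),  date(2014, 5, 20)),
    ("A-003", date(2013, 3, 10), date(2013, 12, 1)),
]

def months_between(start, end):
    """Approximate elapsed whole months between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month)

on_time = sum(1 for _, submitted, decided in reviews
              if months_between(submitted, decided) <= TARGET_MONTHS)
percent_on_time = 100 * on_time / len(reviews)

print(f"Reviews decided within {TARGET_MONTHS} months: {percent_on_time:.0f}% "
      f"(goal: {GOAL_PERCENT}%)")
```

A measure like this is clearly defined, easy to compute from data already being captured, and directly relevant to the review process, which is exactly the combination of efficiency and relevance Malcolm describes.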