Episode 34: The Significance of Data Quality in Market Research

Sharday Torgerson:

Welcome to Stories in Market Research, the Insightrix podcast. I'm Sharday Torgerson, creative and digital strategist at Insightrix in Saskatoon, Canada, and your host. I'm excited to welcome our guests for this episode, Leib Litman and Cheskie Rosenzweig with CloudResearch in New York. Leib is the co-founder and chief research officer at CloudResearch, and as a data quality fanatic, Leib's main priority is to develop tools that improve data quality on online platforms. We are also joined by Cheskie, senior researcher and product scientist with CloudResearch. Cheskie has helped CloudResearch develop Sentry, a data quality tool that uses advanced behavioral and technological vetting to make sample fraud a thing of the past. Thank you both for joining me.

So thanks for joining me today. I appreciate you both taking the time out of your busy schedules to chat. Leib, I know we were talking about the fact that this summer we've been playing a little bit of phone tag. It seems like everyone's getting back together and really enjoying these conferences. I know ESOMAR 75 wrapped up a couple of days ago, so people are connecting with old colleagues, making some new connections. It's been quite a summer. How's it been for you guys?

Leib Litman:

Yeah, thank you so much for having us here. And yeah, the summer's been great. ESOMAR was great. I actually just came back from ESOMAR, flew back yesterday afternoon, and that was a great conference. I was part of the panel there on data quality, and it was a really fantastic discussion. There seems to be a tremendous amount of interest in the industry about data quality right now, and it's just great to be part of the conversation. So yeah, thank you for having us.

Cheskie Rosenzweig:

Great to be here. It's a pleasure, and nice to meet you. Yeah, it's certainly been quite busy. Apologies for all the back and forth that was necessary to...

Sharday Torgerson:

Right. <Laugh>

Cheskie Rosenzweig:

Happy to finally be here. And I guess I'd say I wish things were slowing down a little bit, but I'm also glad they're busy.

Sharday Torgerson:

<Laugh> Yeah, I feel that. I think we're in the exact same position. We're all going into Q4 like, oh, this is amazing, but we're also like, wow, research is picking up. People are really diving into these innovative technologies and getting an appetite again for some primary work, so I'm all for it. But maybe before we dive into this conversation about data quality, which is why I looped you guys in, I wouldn't mind knowing a little bit more about yourselves and about CloudResearch. Leib, let's start with you. As the co-CEO of CloudResearch, I'd love to hear a little bit more about yourself and how things at CloudResearch got started.

Leib Litman:

Well, my background is in experimental psychology. I was an experimental psychology professor for many years, and experimental psychology is a discipline that deals with methodology and statistics. I began to apply my academic background to online research about 10 years ago, and I created some tools along with Jonathan Robinson, who's the other co-CEO at CloudResearch. He's a computer scientist, a professor of computer science. So the two of us merged computer science and behavioral science and created some tools. Originally the tools were really used by academics, and that's how CloudResearch got started. We got started in academia, and as the tools became used more and more within academia, we realized that there was a tremendous opportunity for application outside of academia, in market research, in polling, and in various kinds of government research. That's where we really started to expand beyond academia. And what we found was that across all these different industries, even though they're very different, the issues that they tackle are different, the ways that they do things are different, despite all these differences there's one commonality: they're all experiencing data quality problems. And that's where the tools we created to solve those problems really fit across the industry.

Sharday Torgerson:

I love it. It's something I'm hearing quite often on the podcast: a lot of folks come into the industry with a great passion, they find a gap, they hone in on it, and they provide a tool or a service to really make this industry better. We're seeing it more and more, folks like yourselves, incredibly smart people, developing these tools and then finding an opportunity within areas of market research. So I find that really interesting about CloudResearch; I've been hearing a lot about you guys as well. So Cheskie, how did you get started as a senior researcher and product scientist at CloudResearch? Actually, that's kind of a cool title.

Cheskie Rosenzweig:

Yeah, well, it is a cool title.

Sharday Torgerson:

Yeah. Tell me more.

Cheskie Rosenzweig:

<Laugh> Yeah, so I've been working full time at the company for just around three years now, and I actually started getting interested in data quality a lot longer ago than that. Believe it or not, once upon a time I was Leib's undergraduate student at...

Sharday Torgerson:

Oh,

Cheskie Rosenzweig:

That was my introduction to the world of psychology research and also to data quality. I took a brief detour after undergrad to go get a PhD in clinical psych, which I'm still finishing up, in the last couple of months, hopefully. And...

Sharday Torgerson:

Final leg of the race, hey? <Laugh>

Cheskie Rosenzweig:

Yeah, exactly. But I rejoined CloudResearch full time about three years ago to really focus on building out some of the data quality tools that the company's been working on for years, and in particular this product we have called Sentry, which is really one of the leading products in the marketplace for quality. I'm sure we'll get into it a little bit, but it's kind of what I live and breathe these days, so it's what I talk about.

Sharday Torgerson:

I love it. Then it clearly sounds like we're talking to the right crew today about data quality. That's why we're jumping in on discussing some things around sample quality and online research as a whole. Even within the industry, today's online research environment has, as you know, many threats to data quality and to the validity of online market research overall. I know at Insightrix, conducting online research is a huge part of our service offering as a market research firm, and undoubtedly data quality is an important topic, but it's also inherently entwined within the processes that we use. So I'm curious to know a little bit more about how CloudResearch helps organizations like Insightrix find ways to ensure data quality, especially in online research environments like panels. But maybe before we get into some of these exciting questions, I wouldn't mind setting the stage a little. This is a question for both of you, maybe starting with Leib: what does data quality ultimately mean when we're talking about it, especially in the context of market research?

Leib Litman:

Sure, that's a really great question. And I guess let me talk about maybe the opposite of good data quality, which is fraud. What is bad data quality? Maybe we can start from there. Bad data quality really is multifaceted. There are many different things involved, but one unifying feature is that there's some kind of intentional circumvention of the survey process for monetary gain. When we see that, we want to try to remove it. But there are different kinds of fraud, and I'll give you some examples.

People trying to collect rewards in a survey without really being qualified for it; not making an effort to pay attention while trying to bypass as much of the survey as possible; not being honest; pretending to be someone they're not; taking surveys that they're not qualified to take because maybe they're in a completely different country; fraudulently posing as belonging to a particular demographic group that they don't belong to; intentionally taking surveys more than once from multiple accounts; using some kind of automation to generate closed-ended or open-ended responses in order to bypass the survey. All of these are things that we see happening at a very high rate. A recent study conducted by CASE showed that about 30 to 40% of responses in a typical study do one of these things, and all of these are examples of bad data quality. So what is good data quality? Well, good quality is when you don't have these things. It really is the product of a good-faith attempt on the part of a participant to complete the survey questions. That's how we see what data quality is.
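To make the fraud types Leib lists here concrete, here is a minimal sketch of what rule-based flagging of a single response might look like. The field names and thresholds are illustrative assumptions, not CloudResearch's actual rules:

```python
# Hypothetical rule-based flags for the fraud types described above.
# All field names and thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Response:
    respondent_id: str
    ip_address: str
    country: str          # geolocated from the IP address
    claimed_country: str  # what the respondent reported
    seconds_taken: float
    answers: list[str]

def flag_response(r: Response, seen_ips: set[str],
                  median_seconds: float) -> list[str]:
    """Return a list of quality flags for one survey response."""
    flags = []
    if r.ip_address in seen_ips:
        flags.append("possible_duplicate")   # multiple accounts, one device
    if r.country != r.claimed_country:
        flags.append("geo_mismatch")         # posing as another country
    if r.seconds_taken < 0.33 * median_seconds:
        flags.append("speeder")              # bypassing the survey
    if len(set(r.answers)) == 1:
        flags.append("straight_liner")       # same option every time
    seen_ips.add(r.ip_address)
    return flags
```

Real systems layer many more signals on top of rules like these, but even this much catches the crudest duplication and speeding.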

Sharday Torgerson:

That sounds complex, and we have quite a few things on the table to take into consideration when we're having a discussion around data quality. All of those things ring true when we're recruiting, whether for panels or when looking for survey participants, say, on social media. All of those red flags really play into us as market researchers trying to make sure that our data is qualified for clients. So if fraud's really a big concern online, what are ways that we can address more systemic fraud within the sample industry?

Leib Litman:

Yeah, there are many things that people can do and that they should do. First, there are tools out there that can help identify fraud, and tools that can verify that someone is putting a good-faith effort into actually answering the questions being asked of them and is not trying to circumvent the system in one way or another. These tools can be really effective at identifying click farms, for example, or at identifying devices that are known to produce traffic associated with bad quality data. They can also be used to identify people who are inattentive, who are disengaged, who aren't being honest, and so on. But it's really a partnership, because the other thing that needs to happen is that people need to look at their own data, to make sure that whatever the tools are doing to clean things up actually holds, because no tool is perfect. So it becomes a partnership between the researchers themselves and the various tools they may be using, to really make sure that the end result is as accurate as possible.

Sharday Torgerson:

You know, I'm hearing this quite a bit as well: we're in this tech equilibrium in market research where we have all these great research technologies, but we also need the humans behind them to lead. I believe this is where folks like Cheskie come into play, where we're looking at ways to monitor data quality in market research and beyond. These are tools, as mentioned, with a human-centered approach. What is your approach to this when we're looking at it from a market research perspective?

Cheskie Rosenzweig:

Yeah. So

Sharday Torgerson:

Meaning, monitoring data quality in market research.

Cheskie Rosenzweig:

Yes. I mean, what Leib described is absolutely true. There are these quality tools that exist in the marketplace, and then of course there are the eyes researchers have on the actual data they're getting, and they're both really important. Some of the tools that exist operate very differently from one another, and I think this is what you're getting at. Some are a little more focused on validating devices and ensuring that devices aren't fraudulent, that they're coming from the right place. One of the things that we really focus on is looking not just at the device, but at the person potentially sitting on the other side of the screen any time a survey is being administered. We've identified ways of ensuring that people who come into surveys are actually exhibiting human behaviors, and in particular that they have the key characteristics that good respondents need in order to be good survey takers: making sure that people are attentive and honest, as Leib was mentioning, some of these very basic human qualities that people need to apply to a survey in order to take it. They need to speak the language of the survey, right? Basic things like that. So we've been building tools that really focus not just on the device but on the humanity of the participant, ensuring that they have these basic characteristics that every respondent needs in order to be high quality.
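A rough sketch of respondent-level checks of the kind Cheskie describes, covering attentiveness, open-end engagement, and internal consistency. The item text and heuristics are hypothetical illustrations, not Sentry's actual instruments:

```python
# Illustrative respondent-level checks: attentiveness, language engagement,
# and honesty via internal consistency. All thresholds are assumptions.
import re

def passes_attention_check(answer: str,
                           instructed: str = "somewhat agree") -> bool:
    """An item whose text instructs the respondent which option to pick."""
    return answer.strip().lower() == instructed

def open_end_looks_engaged(text: str) -> bool:
    """Crude engagement heuristic: several real words, not pasted gibberish."""
    words = re.findall(r"[a-zA-Z']+", text)
    return len(words) >= 5 and len(set(words)) / len(words) > 0.4

def consistent_age(reported_age: int, reported_birth_year: int,
                   survey_year: int = 2022) -> bool:
    """Honesty check: the same fact asked two ways should agree."""
    return abs((survey_year - reported_birth_year) - reported_age) <= 1
```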

Sharday Torgerson:

As an end user, I've noticed a shift in things like CAPTCHA. Even that tool has really started to humanize its approach, to ensure that it's picking up on characteristics that maybe only a human would understand, versus, you know, "choose the bridges in the background." I've noticed that shift from the end-user perspective, and what you describe is probably a large reason why. So that's interesting. Again, we were talking about consumer insights earlier and the fact that we're a panel supplier at Insightrix, we offer Insightrix Communities, and we service client surveys. But we've noticed, as researchers who also employ supplier panels, that sometimes clients may not have the same appetite for data quality controls as we do in-house. Do you have any suggestions on how we can work with our clients to ensure that they understand the importance of these methods and that their data is well represented as a result? For example, social media recruitment is something we really struggle with, for all the reasons I think Leib actually mentioned.

Leib Litman:

Yeah, I mean, I think we need a lot more awareness in the industry about just how deep the problems are, and I think this is changing more and more. We see that people are becoming aware of the scope of the issues. Just look at what's happened in the industry in the last six months. We have CASE coming out with a huge study, supported by a whole bunch of brands who really wanted to understand the scale of the data quality issue in the industry. CASE conducted a study with thousands of people from many, many different suppliers, standard industry suppliers, and their question was, okay, what kind of data quality problems are there in the industry?

And what they found was fairly astonishing: 30 to 40% of respondents in a survey are fraudulent, problematic, unusable. So what's the impact of something like that? Well, just imagine 30% of your people being fraudulent. And then the question is, what does "fraudulent" mean? Who are those people? And then just last week, ESOMAR had a whole session devoted to data quality with an amazing panel. It had a couple of folks from Innovate, there were some companies there that provide data quality solutions, and CloudResearch was there; I was part of the panel. We had this amazing discussion, and it was an amazing experience because we had videos of interviews with actual fraudsters.

There was someone who used to make a living in Venezuela filling out surveys. That's what he did all day long, dozens and dozens of surveys every day, every week. He no longer does it, so he did this interview where he talked about exactly what he did, how he did it, how he infiltrated these different surveys. And then there was another interview with someone from Bangladesh who is actually a physics student at a university, believe it or not, and he makes a living teaching other people how to get into surveys in the United States and in Europe while they're living in India, Bangladesh, and other countries. He has a YouTube channel with hundreds of videos teaching people how to do this. So there are tens of thousands of people out there all over the world, in China, Russia, the Middle East, and when you open up your survey, a tremendous amount of it is being filled out by these folks. What that can do to the insights and the accuracy of whatever takeaways a researcher draws from their research is really hard to fathom. And I think that's the most important thing: making people aware of what happens if they aren't doing their due diligence to protect their surveys from these kinds of problems.

Sharday Torgerson:

So, in retrospect, the idea for clients is that data quality itself is what matters. As an industry we're seeing, as you mentioned, that the data can be fraudulent upwards of 30 to 40%. That alone is an indicator, I think, for clients to be aware of the potential solutions, to ensure that when they're conducting online research with their research providers, they're really receiving quality responses. We're constantly getting questions as an insights firm ourselves, and we're always looking for ways to ensure that our clients really support these types of methods. So I'm very happy to have you guys on. In fact, I can't wait to share <laugh> this episode with my colleagues, because we're consistently looking for new ways to innovate on this, both on the service side of things and on the panel-providing side of things.

It's quite immense how much things have changed in the last year, but I think we can attribute a lot of that to education, to these types of conversations being had and put out there. The fact that you guys are even attending a massive conference like ESOMAR and really leading a conversation there, one that I know a lot of people came home and discussed at our table before we even jumped on the podcast, shows this is really important information. So maybe let's flip this a little on its head. Say the client is incredibly concerned about data quality controls. What are ways that panel providers can offer sample quality tools that help ease the client's worry?

Cheskie Rosenzweig:

From my perspective, we love working with those clients. I think there are some clients, unfortunately, who hear a trope and just believe it: oh, quality is great, or it's okay, my panel is taking care of it. And they might not even know who their panel is, because they're working with a research agency who's working with the panel. So some clients just take more of a hands-off approach, and I think it's because they haven't been to a talk like the presentation at ESOMAR, where they see examples of...

Sharday Torgerson:

Where they got scared silly.

Cheskie Rosenzweig:

So when a client does know about that, mm-hmm <affirmative>, I think there's a sense of, oh, we can actually help; we have tools for that. Sentry is our product that we've been building for a while, and it's a really critical layer of protection that can help clients get good data regardless of where the data's coming from. Sentry can be plugged in to work with any panel source and any survey platform. Essentially, any supply coming into a client's project can be vetted before it reaches the client's survey. Sentry can ask questions which, again, get at these key human characteristics and verify them, in addition to a lot of technology running in the background to make sure that people are moving their mouse in human ways and answering open-ended questions in human ways. So that is one tool that we are very excited about. And we can only share it with clients who actually understand that quality is a problem, because the others are just like, well, of course they're good, of course we're collecting humans.
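The "plug in before the survey" pattern Cheskie describes could look something like the sketch below: traffic from any panel hits a vetting step first, and only respondents who pass are redirected on. The endpoint URLs, field names, and checks are assumptions for illustration, not Sentry's real API:

```python
# Minimal sketch of a pre-survey vetting gateway sitting between a panel's
# redirect and the client's survey. URLs and checks are hypothetical.
from urllib.parse import urlencode

SURVEY_URL = "https://example.com/client-survey"       # hypothetical
TERMINATE_URL = "https://example.com/panel-terminate"  # hypothetical

def route_respondent(respondent: dict) -> str:
    """Decide where an incoming panelist goes after pre-survey vetting."""
    checks = [
        respondent.get("passed_attention", False),
        respondent.get("passed_language", False),
        not respondent.get("device_flagged", True),
    ]
    if all(checks):
        # Carry the panel's ID through so completes can be reconciled.
        return f"{SURVEY_URL}?{urlencode({'pid': respondent['panel_id']})}"
    return TERMINATE_URL
```

Because the vetting step sits in front of the survey link rather than inside it, the same gate works regardless of which panel supplies the traffic or which platform hosts the survey.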

Sharday Torgerson:

Yep. It's quite a drastic difference in understanding. But I think what's really powerful about tools like yours is that once you plug and play, I'm sure the validity of the research changes drastically as well. Again, I know from a panel provider perspective it's something that we're always concerned about. So I've got a question for you then, Cheskie. What would you say to a sample panel provider using only a manual process to vet respondents for duplication and fraud?

Cheskie Rosenzweig:

Wow. I would say, come forward from the dark ages!

Sharday Torgerson:

<Laugh>

Cheskie Rosenzweig:

No, I think panel providers are sometimes in a little bit of a difficult place. Some panel providers put quality first; other providers put having as much traffic as humanly possible, or not-so-humanly possible, first. There's sometimes a balancing act going on behind the scenes, where panel providers who don't put quality first need to focus on their bottom line and want to grow their pool of panelists as much as possible. Obviously there's a balance that needs to be struck by each panel supplier. And if they're still doing things manually, I think they're wasting a whole lot of time and effort, and I'm not sure a manual process would even yield high-quality panelists, because it's not an easy thing to really vet a sample and make sure that the people you're blocking are the people who are taking a class from a physicist in Bangladesh.

That's why I think it's always good to be able to plug in existing solutions to help with that process. Mm-hmm <affirmative>. Not to take over, because I think every panel has their own internal needs and they need to look at the data too, but to have some baseline, a standard for quality, and to use a tool to achieve that is really important. It saves time, it leads to better data, and it's probably more effective. And because it's more accurate in weeding out the low-quality people, it can help panels keep traffic as high as possible while also maximizing quality.

Sharday Torgerson:

That's very well said. And it's such a common issue that we're dealing with as panel providers as well: always making sure that we're working from a panel health perspective in-house, really paying attention to the people who are coming in and ensuring that they're verified humans, to some degree. It's an incredible undertaking. I know we have a full panel health team in-house that manages this aspect, and I'm on the team, so I definitely have a bit of insight. But yeah, it's a changing industry, considering online research is just so much more prevalent these days. So we've been talking about sample accuracy.

We jumped into this wonderful tool that I've been hearing lots about. It's definitely a buzz in the industry in terms of providing data quality for market researchers across the board. Again, it has that real for-researchers-by-researchers vibe, which I'm finding a lot more appealing in the types of technologies market researchers work with, because it's always easier for us to employ a tool built by somebody who really understands the complexities we're dealing with. What I'm hearing is that you guys are really investing your time, CloudResearch's time, in tackling these challenges in the industry, and we're really appreciative of it. But I want to dive a little bit more into this Sentry tool; we've been talking around it a little bit. I wouldn't mind hearing what Sentry does to leverage these behavioral and technical checks to screen out bogus respondents, without giving too much of the meat and potatoes away.

Leib Litman:

Yeah, so Sentry really has a couple of different critical components that we think are very important for ensuring data quality. I'll take a step back for a second and just say that any tool has to be able to address the different kinds of fraud, so it needs to be multifaceted. For example, some fraud comes from click farms, and these click farms are in other countries. So some tools need to be device focused, to be able to see that traffic, keep track of it, and block it. Other fraud comes not from India or Bangladesh but from here in the United States: somebody taking a survey without caring at all about what they're doing and just responding randomly.

To the end user, there's no difference between the two, because at the end of the day, your data is not usable. Whether it's somebody in Bangladesh posing as someone in the US who doesn't even speak English, or someone in the US who speaks English perfectly but just finished playing some video game, is being given some kind of reward for finishing the survey, and is just clicking through randomly, the end result is the same. So tools really need to be able to do both of these things. Traditional tools on the market really focus on the device: making sure the device is coming from the right location, that there aren't too many duplicates, removing duplicates, making sure that some kind of technology isn't being used to make the person appear to be in the US when they're really somewhere else.

All of these device-based approaches are traditionally what people have used for protection. But what these kinds of approaches can't do very effectively is address the other issue: people who are not in India, Bangladesh, Russia, or wherever, but who are completely not paying attention, are unengaged, or are trying to get into a survey claiming to be a nurse when they're really not. These issues pose as much of a challenge, and an even bigger challenge when it comes to B2B studies, where we actually see fraud rates much higher than 40%, because it's so easy to pretend to be someone you're not on these kinds of surveys. That's really where Sentry is unique: it has these components, modules as we call them, and each module is developed based on the kind of instrument development and validation principles that are part of our academic background.

This is what we were trained to do professionally: develop instruments for behavioral testing. That's what we do in clinical psychology and in experimental psychology. So we created these modules within Sentry that look at attentiveness, that look at acquiescence bias, that look at behavioral metrics. And that's the other thing that's unique to Sentry: we actually look at mouse movements to identify what people are doing on the screen, separate from the way they answer a question. Not just what answer they provide, but what they're doing on the screen while they're answering. For example, some people will have all these apps within their browser, and then they'll go off the screen, they'll click a button, and all of a sudden the entire question is translated into a different language, into Chinese, for example.

Sentry essentially detects automatically when that happens, and then that person is red-flagged. There are also all kinds of other patterns that are very unique to people who are fraudulently answering surveys. We call this behavioral metrics, so we use behavioral metrics and behavioral analysis, and we have all these different modules. We additionally have modules that are specific to B2B segments. If somebody is claiming to be an IT decision maker, well, how do you know that they are? Sentry has validated instruments to actually verify that they are an IT DM, that they're a nurse, that they're a doctor, that they're within a particular segment. And beyond that, we have incredible open-ended review solutions that are very smart. They're real AI. People always say AI, you know; everybody's doing AI.

But we have some really smart AI that we're using for open-ended review that is just very powerful. And all of these tools are calibrated and tested not only to be accurate but to have as few false positives as possible, and that's a really important thing. People use these attention checks and red herrings and so forth, and what they very much don't know, because they've never tested it, is what percentage of good respondents are being prevented from participating. That's something we test for and validate as a standard part of Sentry's development.
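The false-positive calibration Leib mentions can be illustrated in miniature: run a candidate screen over a benchmark sample of known-good respondents and measure how many it would wrongly exclude. The 2% tolerance below is an illustrative assumption, not a published standard:

```python
# Sketch of false-positive calibration for a quality screen: measure the
# share of known-good respondents a proposed rule would wrongly reject.
from typing import Callable

def false_positive_rate(screen: Callable[[dict], bool],
                        known_good: list[dict]) -> float:
    """Fraction of genuine respondents the screen would reject."""
    rejected = sum(1 for r in known_good if not screen(r))
    return rejected / len(known_good)

def acceptable(screen: Callable[[dict], bool],
               known_good: list[dict],
               tolerance: float = 0.02) -> bool:
    """Ship the screen only if it rarely punishes good respondents."""
    return false_positive_rate(screen, known_good) <= tolerance
```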

Sharday Torgerson:

That's a really unique way of thinking about it as well. Considering we're in central Canada, Insightrix is in Saskatchewan, we were kind of raised on this idea that we have to fit within a nationally representative sample of the population, and yet to have an accurate understanding of where we live, you need the right distribution of people. We rarely get represented accurately according to national statistics, because we're just too small compared to the national average. So we find it's super important to maintain sample diversity for hard-to-reach segments. And you mentioned that Sentry and CloudResearch really support things like open-ended responses, where we find a lot of the time that we run into issues as a smaller sample base.

We're looking for hard-to-reach segments, we're trying to connect with people using maybe an open link, and we're just not seeing those results. It's something we struggle with quite frequently as a result of the marketplace that we serve, and I really anticipate this type of tool could be beneficial for those hard-to-reach segments that I know a lot of researchers are always looking at. Even though we're a small marketplace, international research is another example where you're consistently looking to ensure sample diversity and better data quality, but you might not even be located in the same neighborhood or the same region, never mind the same country. So I'm assuming the tool would be really beneficial for working with hard-to-reach segments no matter where you are.

I guess we're at the tail end of this conversation. We dabbled in the idea of data quality and how you guys are really investing in the industry, and the Sentry tool sounds like something I need to tap into myself, I'm thinking <laugh>. But do you have, and this is open to the floor for both of you since I'm sure you have different perspectives, any predictions about the future of online research and how the importance of data quality will change? I know we've been talking about this moment of education that we're in; we're in the information era, and hopefully folks like yourselves can be real influencers in that aspect of things. But I'm curious: what do you see in the future of online research, especially considering data quality is top of mind?

Leib Litman:

Yeah, just seeing what's happened in the last half a year, there's a tremendous increase in awareness. I already mentioned ESOMAR and the CASE study, but the Insights Association has now created a data quality task force, and we're on that task force. We're seeing the different people who are on it representing so many different segments of the industry. With that task force, we're making a presentation about data quality issues at the CRC in New York, and we're putting together a whole toolkit for the industry to become more aware of the scale of the problem, the kinds of impact this problem can have on the bottom line,

and the kinds of solutions that are out there, along with some best practices. So I think this kind of awareness, and the need and the demand within the industry for us to do better as an industry, is just going to increase more and more. We're having conversations with people every single day, and we're also seeing the specific ways in which data quality problems really impact brands. At the last Quirk's event in New York, just a month ago, there was a talk about data quality presented by CASE in collaboration with PepsiCo, and they gave some incredible examples. In one of them, I think it was P&G, they ran a study for a particular go-to-market campaign for a product, I think it was a Crest product or something like that.

They conducted this whole study, they invested a tremendous amount of money in it, and in this study 54% of respondents endorsed the product, said it was a great product. So they went to market with the product, and it wasn't selling; it was getting horrible reviews. People hated it. So they were like, what happened? Why were we getting such positive feedback in this study when it turns out nobody liked the product? So they went back and looked at verified respondents only, and instead of 54%, it was 24%. As they were figuring out what was going on, it turned out that in the original study there were all these people who claimed to have used the product but hadn't actually used it. They were just making it up.

And people who make things up tend to be positive about everything. They'll say yes, yes, yes to everything: yes, we use the product; yes, we love it; yes, we're going to use it again; yes, we're going to pay a lot of money for it. So this influenced the entire campaign, millions of dollars, and it was just wasted. This kind of stuff happens all the time and people just aren't aware of it. But now people are becoming more aware of it, and they realize that whatever resources they allocate toward preventing fraud are well worth it, considering how much it costs down the road if you don't address these issues. And we've seen this so many times in other contexts. The Centers for Disease Control came out with a study where they concluded that during COVID, something like 10% of people were drinking bleach to prevent COVID.

Well, it turns out essentially 0% of people were doing that <laugh>. The 10% were these folks in Venezuela and other people in the United States who were just saying yes, yes, yes to everything. The only people who said they were drinking bleach to prevent COVID were people who also said things like, yeah, we live on Pluto and we eat concrete for breakfast, crazy things like that, people we call yea-sayers. So fraud is affecting our society, it's affecting science, it's affecting the industry. The more people become aware of it, the more they're going to realize we can't just keep going this way. So to answer your question about what I think is going to happen: in the very near future, the demand for change is going to increase tremendously, and we're going to see change, because there are tools already available and best practices that are already well known. It's just a matter of people starting to use them.
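One way to operationalize the yea-sayer pattern Leib describes is to seed a survey with bogus items that no honest respondent could endorse, then flag anyone who agrees with several of them. The item wording and cutoff below are hypothetical, echoing the examples from the conversation:

```python
# Hypothetical bogus-item screen for acquiescent ("yea-sayer") responding.
# Items and cutoff are illustrative, based on the examples discussed above.
BOGUS_ITEMS = [
    "I have drunk bleach to prevent COVID-19.",
    "I currently live on the planet Pluto.",
    "I eat concrete for breakfast.",
]

def is_yea_sayer(endorsements: dict[str, bool], cutoff: int = 2) -> bool:
    """Flag respondents who endorse two or more impossible statements."""
    endorsed = sum(endorsements.get(item, False) for item in BOGUS_ITEMS)
    return endorsed >= cutoff
```

A single slip can be forgiven as a misread, which is why the sketch requires agreement with more than one impossible statement before flagging.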

Cheskie Rosenzweig:

Yeah, I too am quite optimistic. And I think the change that's going to come will have to be driven by both sides. Client awareness is going to continue to grow, and as it grows, demand for better quality is going to reach a point where it's actually in everyone's best interest to put forward strict standards for quality. Once we have that awareness and those standards within the industry, I think we're going to be in a better place.

Sharday Torgerson:

Well said. I honestly couldn't agree more. It's such an interesting industry, where we're actually trying to meet those standards with what we're doing, and the client education is so important as well. I appreciate both of your time today; we could probably talk all day about data quality. This is a topic you're both clearly very passionate about, and I think it's something we're going to be tackling more and more as we go on. And I'm sure, as industry leaders yourselves, we'll be hearing more about what CloudResearch is doing to invest in this area. So thank you both for joining me today. I'll leave you with any final thoughts, but I really appreciate you both being here.

Leib Litman:

Thank you so much for having us. It was a lot of fun.

Cheskie Rosenzweig:

A pleasure. Awesome.
