16:00:18 I'll start off by introducing our lovely colleague, Christine Weaver. 16:00:26 She's been in the veterinary industry for many years. 16:00:29 The majority of that time has been in neurology, where she obtained her VTS in 2012. 16:00:36 In addition to her clinical work she is engaged in training, education, speaking, and writing; she truly enjoys promoting the vet tech career and seeing people grow within it. 16:00:47 Recently she graduated from Penn State with a degree in biobehavioral health, where she studied research, evidence-based medicine, and how to leverage scientific evidence for developing opinions and perspectives in numerous walks of life. This is what 16:01:05 she's going to discuss today. She's very passionate about it, and she will introduce us all to this topic. 16:01:10 So welcome, Christine, and thank you so much. 16:01:13 Thank you, Kate, and thank you, Liz and Linda, 16:01:16 and everyone, for this opportunity to speak to you guys and everybody who's attending this evening. 16:01:21 It's a Sunday evening here on the East Coast; 16:01:25 it's kind of midday for you guys over on the West Coast. Regardless, 16:01:28 thank you so much for taking time out to listen to me 16:01:31 talk about something that I'm extremely passionate about, which is kind of odd, right? 16:01:35 Like, who gets really excited about scientific literature? Well, people who actually love learning for the sake of learning. One of the things that's important to me, and one of the things that we'll briefly talk about in today's presentation, is being able to understand the viewpoint and getting into the 16:01:52 mind of the author — the hermeneutics — of whatever article or information you're reading. So what I want you guys to understand about me as I'm presenting this information is: one, I do learn for the sake of learning. I want to understand the world 16:02:07 around me, because, much like one of my favorite songs, 16:02:11 "The Greatest Show on Earth" by Nightwish, I really do truly believe that one of the noblest pursuits we can have is the pursuit of truth and how we can apply it to our day to day. With regards to medicine, 16:02:24 it's how we apply it to alleviate suffering. 16:02:26 So I am a nerd; you can even see the space scene behind me, and that's just because I'm absolutely in love with the universe, 16:02:35 what I do know and what I don't know. But to 16:02:38 bring it back to a more grounded situation, here's what I would like for you guys to come away with today. 16:02:47 One, I want to make sure that you guys have an understanding of what science means, 16:02:50 a little bit of its history, 16:02:56 and how scientific literature fits into it. I want you to understand the types of scientific literature that we find most commonly in veterinary medicine, knowing that there are huge numbers of types of literature that can be found, 16:03:04 and to come away with a basic understanding of certain study designs, certain types of articles, 16:03:09 their strengths, their limitations, and then how you can apply this information for whatever purposes you have. 16:03:16 So if you are a VTS applicant: woohoo, awesome, glad to hear it. 16:03:20 This kind of information will help you, because we now require you guys to use scientific literature as part of your case 16:03:27 reports.
So knowing what kind of articles you're going to want to look for, and the best places to find them, is going to be key. 16:03:34 But some of you may also really be interested in things like providing professional presentations, in which you would need to do the same, as we are supposed to be on the forefront of medicine. 16:03:46 So how do we stay on that forefront, so that we can present that information to our colleagues, both in presentations such as this PowerPoint and in writing? And also just for learning for the sake of learning. Like I said, I am a complete nerd; after I graduated from Penn State I decided I wanted to 16:04:00 learn about paleoneurobiology. So finding something that you're excited about and being able to learn about it is a lot of fun. With that in mind, I don't want to confuse anybody about what my intentions are. 16:04:12 I really do want you guys to understand scientific literature in the most basic sense. 16:04:17 You are not going to be an expert. You're not going to be able to read Andrew Wakefield's article from the late nineties saying that vaccines cause autism and be able to pick it apart and say, "Well, it was obvious it was a bad study." You are not going to be that kind of an 16:04:29 expert; you just want to understand what's out there. 16:04:32 The other thing is that I will provide some examples of scientific literature, just to provide a picture of what I'm talking about. 16:04:40 But I don't want you to think I'm trying to sway your opinion on any one particular subject, whether it's veterinary or otherwise. 16:04:47 So there are a few things that we'll do to get there. Number one, 16:04:51 we're going to review a little bit of history, because that's just who I am; 16:04:55 we love the history. I'm going to introduce you to some of the basics of scientific literature. We are going to do a brief statistics review, and if you were in a room with me right now, I would tell them to lock the doors, because it is not that scary and it is 16:05:07 important to understand some degree of statistics when we're reading these things. We're going to talk about some study designs and when you can leverage each one. And then the other thing is my soapbox segue, which usually means that if you see the soapbox pop up 16:05:22 on the screen, it's because I have a really important point that I want to make sure you understand. 16:05:27 So we're going to start with a question, because science and philosophy aren't actually all that different. 16:05:35 I'm going to give you like 10 to 15 seconds to think about it: today is Sunday, 16:05:41 but how do you know with certainty that today is Sunday? 16:05:47 I do want to give you guys the opportunity to think about it, because it kind of blends into how we know what we know about anything, right? 16:05:57 So we know today is Sunday, probably in part because our calendars told us. 16:06:02 But then also yesterday was Saturday, and we know tomorrow's Monday. 16:06:06 But where did this start? Essentially, what happened is one day somebody decided it was Sunday, and everybody said, "Sunday? 16:06:13 Yeah, it's cool, it's cool. We'll just call it Sunday, and then the next day will be Monday." 16:06:17 And then one day, several generations down, somebody told you that this is what Sunday is, and you're going to pass it down, and we'll continue to pass it down long after we're gone.
16:06:28 So there's no way that I can run a test or design a study to prove that today is Sunday, and that's really how science started. 16:06:37 I mean, before the Enlightenment we just took people's word for everything. 16:06:41 If they were an authority figure in any way, shape, or form, we said, "Yep, that guy's smart. 16:06:44 We are absolutely going to follow what he has to say." We accepted what we were told as the truth, and we passed it down and moved on, much like "today is Sunday." 16:06:54 I am not encouraging anybody to check whether or not today is Sunday; 16:06:58 it was just an example. But another good example is the humoral theory of medicine, where it was actually thought for hundreds of years that what made us sick was an imbalance of certain fluids in our body. 16:07:11 Nobody challenged it. Nobody wanted to ask, "Is this really true?" 16:07:14 We just accepted it and moved on. And then Descartes came along, and he made things 16:07:20 just a little bit more complicated. And again, this is where science and philosophy come together. 16:07:25 He introduced the idea of the radical skeptic and posited: 16:07:28 have you ever tried to unlearn everything that you have ever learned in your life, and then only accept whatever has been put in front of you? 16:07:37 It's a very interesting thought experiment that, if you ever want to try it, is apparently a very enlightening thing. 16:07:46 Think about the fact that if you have never been to Antarctica, you would never know that Antarctica exists. 16:07:52 Anything you've never seen, touched, or felt doesn't exist. 16:07:55 That's the idea of the radical skeptic: if you have not seen it for yourself, it does not exist. And Descartes even took it a step further. 16:08:04 We have this idea of the solipsist, and these are people who genuinely believe the only thing they can possibly know to be true is that they exist. 16:08:14 Our senses could deceive us. How do I know that what I'm seeing is actually in existence? 16:08:18 So it became a very interesting thing. Now, Descartes certainly challenged the status quo and started that ball rolling: 16:08:26 to think maybe we should start to challenge the things we think that we know. 16:08:29 And that's when we started to see the rise of empiricism and epistemology. 16:08:34 So again, this is going to be where we use our senses to evaluate the world around us, 16:08:42 to observe what is going on, and where we start distinguishing a justified belief from just an opinion. So you can kind of see, okay, this is a very 16:08:58 philosophical start; I did start off saying that I've been, you know, a lover of learning. 16:09:04 So what exactly is science? I thought this was an interesting question 16:09:08 that was posed to me in one of my classes. 16:09:11 They literally just asked, "What is science?" And as you're sitting there now: how would you define science, exactly? 16:09:17 If you're like me, well, at the time, if I was smart I would have gone to the Oxford Dictionary and said, okay, it's "intellectual and practical activity encompassing the systematic study of the structure and behavior of the physical and natural world through observation and experiment." And that is a 16:09:32 horrible sentence, but that is what is in the Oxford Dictionary.
16:09:35 And if any of you are interested in presenting, sometimes it's kind of fun to throw out a random dictionary definition, but then always follow up with something that's a little bit more specific to what you're talking about, because the Science Council comes back and says it's "the pursuit and application of 16:09:50 knowledge and understanding of the natural and social world following a systematic methodology based on evidence." 16:09:57 So it's kind of saying the same thing, but notice: 16:10:01 it's a pursuit. It's an application. It is an active process. 16:10:05 We don't just sit there and go, "Science. Yes, science was done, and science exists. 16:10:10 And now this is science." Science can only truly be science if we are always challenging it. 16:10:15 We are always testing it. We are always using the most up-to-date technology and the most up-to-date things going on in the world to be able to evaluate it. 16:10:24 So how does this apply to your VTS application, or to reading scientific literature in general? Well, we know the half-life of a drug, right? 16:10:37 It's the amount of time for half of a drug to be metabolized and no longer useful in the body. 16:10:41 Knowledge is kind of the same way. 16:10:47 The half-life of knowledge is about how long it takes for half of what you've learned to essentially be metabolized and no longer be useful to the body of knowledge in the industry that you're interested in. And all in all, it's going to change from industry to industry, right? So, you know, 16:11:05 whatever is on the forefront of genetics and epigenetics, that half-life of knowledge is going to be very different from my love of paleoneurobiology. 16:11:13 That doesn't mean that things don't change over time, especially if you look at how we thought dinosaurs looked 50 years ago versus what we've learned now. But it's just understanding that half of what you've learned is going to be wrong in the not-too-distant 16:11:29 future. In 2012, Samuel Arbesman actually published a book about the half-life of facts in general, and he gave about 5 years for the medical profession. 16:11:37 I couldn't find anything very specific to veterinary medicine, but this was about as close as 16:11:43 I could come. For another example, at Harvard there was a lecture on cancer research, and the speaker said that there, really, the half-life is about 18 to 24 months. 16:11:56 So what exactly it is in the veterinary industry, we're not a hundred percent sure. 16:12:00 But we do know that the longer time goes on, the less of what we learned in school is actually still accurate. 16:12:08 So with that in mind, take a moment as you're looking at this and think, okay, I graduated tech school 16:12:13 in this year, and for me, that is where my half-life of knowledge sits. 16:12:21 So in theory, I've got about 12 and a half percent of what I learned in tech school still applicable to what I can use on the floor and in my training and education today. That's not a very large percentage, 16:12:32 and that's why it's so important that we do things like CE, continuing education, and also read the scientific literature: the articles, the studies that are going on. 16:12:43 It's not just going and learning about anesthesia, or just learning about internal medicine. 16:12:47 It's honestly learning about what is on the forefront and what is changing in medicine.
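To make that 12-and-a-half percent concrete, here is a minimal Python sketch, assuming knowledge decays as a clean exponential with Arbesman's 5-year medical half-life. The clean decay curve is my simplification; real knowledge obviously doesn't expire this neatly.

    # Minimal sketch: fraction of what you learned that is still current
    # after t years, assuming clean exponential decay (a simplification;
    # this is just the arithmetic behind the slide).
    def knowledge_remaining(years_since_graduation, half_life_years=5.0):
        return 0.5 ** (years_since_graduation / half_life_years)

    # 15 years out with a 5-year half-life is three half-lives:
    # 0.5 ** 3 = 0.125, i.e. about 12.5% still applicable.
    print(f"{knowledge_remaining(15):.1%}")  # -> 12.5%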
16:12:51 Now, I am not saying that if you graduated 15 years ago, most everything you learned is garbage and you should throw it out and forget it. 16:12:59 But like all things in science, sometimes we just need to challenge it to see if it's even still true. 16:13:04 So with that — my slide got reorganized, and I'm so sorry; that is supposed to just say "scientific literature," because that's what we're going to move on to. 16:13:13 I liken our body of knowledge around scientific literature to this color wheel, where we have primary, secondary, and tertiary resources, and one is not better than the other. 16:13:24 They just work together in concert to create the picture that we're looking at. 16:13:29 And we start with primary literature, and this is where we're going to focus the majority of today's talk. 16:13:35 This is original research: the authors on these articles are the ones that actually did the research. 16:13:43 And you want to find reliable sources of primary literature, because there are some journals, 16:13:49 some magazines out there, that will publish whatever you want them to, so peer-reviewed journals are the best way to find good primary research. 16:13:59 JVIM is a great example of a peer-reviewed journal where you can find all kinds of up-to-date literature. The way that you would identify these, if you're just flipping through a journal or any kind of magazine, is 16:14:11 by their IMRaD structure. Basically, you're going to see an abstract, an introduction, materials and methods, results, 16:14:18 discussion, and conclusions. They're broken down in a very systematic way, 16:14:22 just as science is designed to be very systematic when we approach it. 16:14:27 They have their good and bad points. Just be aware that primary research is really good for direct information on a very specific study; like we said, the researchers themselves published their information about this study on this topic. 16:14:41 It can also be very timely; it's the most up to date 16:14:44 you can get. The drawback is that it can be kind of hard to follow, especially if you're not used to reading scientific literature. Even me, and I'm very familiar with it, 16:14:52 sometimes I get lost in some of the words. With that, it can be kind of hard to discern what's good research 16:15:00 and what's bad research. And again, that is not the goal of today's talk. 16:15:03 So it can be hard to discern: if you see a groundbreaking article about something amazing, it can be a little bit difficult to say, 16:15:13 should I change everything that I do based on this article, or should I hold back for a moment? 16:15:18 We'll talk about the merits of that. Secondary literature is not original research, but it summarizes information that was found in a bunch of primary research articles in order to cover a specific topic in depth. In the neuro world, it's not going to be, "Hey, we're just going to talk about 16:15:35 intervertebral disc disease." It might be something very specific about intervertebral disc 16:15:41 disease, like genetic factors or environmental factors. 16:15:44 And these articles are usually pretty easy to sniff out.
16:15:49 They usually say "review" in the title, or somewhere within the introduction or description of the article. You can also tell what these articles are because, unlike primary research, they don't have that materials and methods section, meaning there's no transparency in how these 16:16:06 authors put these articles together. 16:16:10 The exception to that is going to be systematic reviews and meta-analyses, which I'll go over. And then tertiary literature: 16:16:17 I think we're all familiar with this, because we all went to tech school. 16:16:22 We all understand what a textbook is. And really, this is going to be the most common tertiary resource that we have, 16:16:26 and it's going to be a very broad summary of a scientific topic. 16:16:31 So you can see, here we have veterinary neurology; 16:16:33 the most common one, the textbook I think everyone has, is the McKinnan text. 16:16:39 And these things are really great to have as a reference 16:16:42 whenever you're trying to look up general background information, general facts. 16:16:48 If you're trying to write your case report and you just need, say, information about what exactly a Hansen type I disc extrusion is, 16:16:57 they're really good for that. They're much easier to read, 16:17:00 unlike our primary research. The limitation is, well, again, half-life. 16:17:05 If we go by the medical profession — which, again, I recognize may not be the most accurate for us in vet med — 16:17:11 the half-life of knowledge is about every 5 years, and textbooks are published about every 10 years. 16:17:18 You can imagine how much information in these books may actually be outdated. 16:17:22 So if you are using textbooks, make sure you have the most up-to-date edition to ensure that you're giving good information. 16:17:30 So with that, we'll go ahead and start with primary research. 16:17:34 As I mentioned previously, these can be identified just by looking at how the article is structured. 16:17:40 Are you seeing an IMRaD: introduction, materials and methods, results, and discussion? 16:17:45 The introduction is going to give you essentially the why — the question that they're trying to answer — and that's how every primary research study starts: with a question that they would like to answer. They are then going to review, 16:17:57 in their materials and methods section, how they're actually going to gather information to try and answer that question. 16:18:04 Then they're going to present the results to you, and then have a discussion of what those results mean, 16:18:08 both for the clinician that's reading it, or the vet tech that's reading it, 16:18:12 or the end user themselves. Again, these things can be very granular, and the information is primarily going to be limited to the study. 16:18:21 So if you have a hard time reading through these, or you haven't read through them before, I say we can look for the quote-unquote 16:18:28 CliffsNotes. The entire article is important; 16:18:32 all of the information is important. But if you're trying to pull out the big points, look at the abstract: it's going to tell you what they wanted to do, briefly how they did it, and briefly what the results were. 16:18:44 Look at the discussion and conclusion; reading through materials and methods can be really hard
16:18:48 sometimes. So definitely, those are the big areas you'd probably want to look at. 16:18:54 And then we have the actual study design. There are so many different types of study designs out there; 16:19:03 I am barely scratching the surface with what I present to you guys today, because I feel these are the most relevant to what we read in vet med. Just know there are tons of other study designs that you can come across when reading journal articles. So again, the whole 16:19:15 thing is, they're going to choose a design because they're trying to answer a question, 16:19:21 and usually that question is whether or not two variables, or more than that, are related in some way. Each kind of study design has its strengths and its weaknesses. Things to think about would be retrospective versus prospective studies, 16:19:34 observational versus experimental, quantitative versus qualitative. 16:19:39 And before we go too much further, I did promise you a statistics review. I didn't want to let you down, because I know that's what everybody wanted to do on a Sunday afternoon: talk about statistics. 16:19:50 I love statistics. I'm one of those weirdos, 16:19:52 and I recognize that. But there are some very basic things that you guys should understand as you're reading through these articles. 16:19:59 Number one is going to be your population versus your sample. Right? 16:20:02 Your population is the entire group of things that you want to study. 16:20:08 Maybe it's all United States citizens. Maybe it's all dachshunds that are out there. 16:20:13 Maybe you want to know about all veterinary technicians. 16:20:16 You can't reasonably study every person or animal in that way. 16:20:21 So what we do instead is take a sample, and that sample should be representative of the actual population at large. It wouldn't make sense to choose people based just on, you know, socioeconomic status, 16:20:34 or to say, "I just found a bunch of people in New York City; 16:20:35 I'm sure that'll translate to the rest of the United States." 16:20:40 And within that sample they're going to be studying what we call a variable. 16:20:44 This is actually a really hard thing to define as well, 16:20:47 but basically, it's a thing that changes in the study. 16:20:48 And essentially, a lot of times what we're trying to figure out is: does one variable, or a change in one variable, affect another one? 16:20:57 If you think about it very simply, with regards to things like healthcare: 16:21:01 does smoking influence somebody's chances of getting lung cancer? 16:21:06 Of course we all know the answer to that, but those are the kinds of studies that were done in the past that we now take for granted. 16:21:12 You might also read, particularly if you're trying to talk about certain diagnostics that were leveraged for one of your case reports and why, 16:21:21 about things like sensitivity and specificity. How often is the diagnostic that you are using actually going to identify the disease in patients that have that disease, 16:21:33 versus how often is it going to correctly rule out disease in individuals that do not have the disease? 16:21:40 That's the difference between sensitivity and specificity.
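If it helps to see the arithmetic, here is a minimal Python sketch of sensitivity and specificity computed from a two-by-two confusion matrix. The counts are invented purely for illustration:

    # Minimal sketch: sensitivity and specificity from invented counts.
    true_positives  = 90   # diseased patients the test correctly flags
    false_negatives = 10   # diseased patients the test misses
    true_negatives  = 80   # healthy patients the test correctly clears
    false_positives = 20   # healthy patients the test wrongly flags

    # Sensitivity: of the patients that HAVE the disease, how many are caught?
    sensitivity = true_positives / (true_positives + false_negatives)  # 0.90

    # Specificity: of the patients WITHOUT the disease, how many are cleared?
    specificity = true_negatives / (true_negatives + false_positives)  # 0.80

    print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")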
16:21:43 And we'll briefly come back to that on another slide. And then, as I stated previously, if I'm interested in the entire population of the United States, I am not going to just grab a random sample of people I find on the street in New York City and then say, "I'm sure this information applies to 16:21:58 everybody in the United States." That's the generalizability of the information in that study. You might hear researchers ask, "Is it generalizable? 16:22:07 Not generalizable?" They should be very open about that, and essentially it's how truthful, how valid, the results of that study are for the whole population that you're interested in applying them to. 16:22:18 And that's pretty much the same thing as your external validity. 16:22:22 But internal validity is how true, how valid, those results are going to be within your study or your sample. 16:22:28 Those things are very different, and it's very important to understand how they're different. 16:22:34 And relationships are probably one of the most important things to all of us, 16:22:37 however we choose to identify or live in our relationships, and science is no different. 16:22:46 We are always trying to see if one thing is related to another, 16:22:50 if manipulating one thing will change another for the betterment of our patients. 16:22:55 So number one, we can see that association is really just any relationship between two variables, 16:22:59 and sometimes it's enough just to say whether or not there is an association. 16:23:02 It could be weak. It could be strong. It could be positive. 16:23:06 It could be negative. It's any relationship. So just because something's associated, or someone says "this is associated with that," it really just means that there seems to be some kind of a relationship; 16:23:17 we don't necessarily know what it is. Covariation is two variables that literally change together in one way or another: as one goes up, 16:23:26 does the other one go up? As one goes up, does the other go down, as you can see in this negative correlation? 16:23:36 Or is there just no correlation or no association whatsoever? 16:23:41 So you can see here what we like to do with correlation 16:23:44 is try to make it a linear relationship. Here you can see 16:23:47 somebody's actually put a line in to try and describe the association and the covariation of these variables. 16:23:55 And we will also assign a strength to that; 16:23:58 that's on the next slide. The last kind of relationship is causation, 16:24:02 and this is the one. This is that golden idol from the beginning of Indiana Jones, where we're very delicately trying to get at causation, because that's the ultimate goal. 16:24:14 We want to be able to prove causation, but honestly it's one of the single hardest things that we can do. 16:24:19 If anybody says they did one study and absolutely proved causation, then you should automatically throw that over your shoulder, because that is not, for the most part, how science and determining causation work. Two other things. The p-value: 16:24:34 the p-value is essentially just a way of saying, what is the likelihood that results like yours would happen from random chance alone? 16:24:39 Because random chance is always possible. And that's another reason why we want to repeat studies and challenge the status quo.
16:24:46 What if what we "know" was essentially due to random chance and we didn't even realize it? So in an ideal world, we want this number to be less than 0.05. 16:24:57 But regardless, the lower the number the better. Just understand, there's no such thing as a p-value of 0, because we just do not accept that there isn't the possibility of random chance. 16:25:09 The other thing is, when they talk about this relationship with correlation, as I showed you on the previous screen: we have these plots here, and you can see some look like a nice, strong, straight line; 16:25:21 they look very strongly related. Whereas this one just kind of looks like a hodgepodge; 16:25:26 it seems very random. We really want numbers that are close to one. You can see this is a negative 0.9, 16:25:30 and this is a 1. These show us there's a very, very, very strong relationship. 16:25:35 If it's positive, it means that as one variable increases, the other variable increases with it. 16:25:41 And if it's negative, it means that as one variable increases, the other one goes down. So it's just a descriptive term for what we're looking at, 16:25:52 but it can give you an idea of how strongly two variables are related. And we're seeing this same chart again, where I talked about sensitivity and specificity; 16:26:01 now we're talking about type 1 versus type 2 errors. 16:26:04 You will see this in some of the scientific literature you read, and essentially the long and short of it is: a type 16:26:09 1 error is, "I thought I had something really significant and cool, and it turns out, 16:26:16 due to random chance 16:26:19 or due to a confounding variable, I didn't." A type 2 error is where you have a false negative: 16:26:30 you didn't think you had something significant, but in fact you did; 16:26:34 you just had the dumb luck of not catching it. Another reason why it's important to have several studies on any given topic. 16:26:37 The confounding variable here is anything that's not directly related to the hypothesis or the question at hand. And I apologize 16:26:45 that my image is over top of that; something has happened with me sharing my PowerPoint, 16:26:50 apparently. But it's outside of the question at hand; 16:26:55 it's outside of the research. It's not something that the researchers anticipated, but it is affecting the outcome. 16:27:02 And I'm going to have some really fun examples of that coming up. 16:27:05 So what you might see when you read things is that they say the researchers controlled for age, sex, breed, things like that. 16:27:14 That was their attempt to say, "These could be confounding variables; 16:27:17 we're going to control for them." I'm not going to go into what that means or how you control for those variables, 16:27:23 but just know that there can always be a confounding variable, and we're going to do a couple of exercises to get you thinking about those. The big thing that I want you to think of is that just because things are associated, or they are correlated in any way, it does not mean 16:27:36 causation. There are four things that have to happen in order for there to be causation. Number one, there has to be an association; I mean, they have to be related somehow in order for there to be causation.
16:27:47 Number two, the supposed cause has to precede the effect. Number three, you have to rule out whether there is a reverse correlation, meaning I thought A caused B, 16:27:55 but it turns out B causes A, and that's how they're related; or a bidirectional one, where A causes B and B also influences A, which in turn then influences B again. You also want to make sure that other, more reasonable rule-outs have been considered, and that this really is the most 16:28:14 common, most basic, realistic explanation for what's going on in the natural world. 16:28:19 And number four, there has to be a plausible mechanism of action. 16:28:22 If these things cannot be met in the article that you are reading, then it cannot imply causation, nor should you come to that conclusion. 16:28:31 The general rules of thumb: number one, no single observational study can imply causation. 16:28:36 We'll talk about what observational means; just know that it cannot. 16:28:42 You cannot imply causation with a single observational study. You can with multiple observational studies, as with smoking and cancer, 16:28:50 but you cannot do it with a single one. Experiments, like randomized controlled trials, can imply causation in a single study, especially if the results are really 16:29:01 strong. But this is not frequently the case, and, like I said, there's always random chance. 16:29:04 We should always challenge and repeat to make sure that we are closer to the truth, and it is never possible, even in the best of circumstances, to ensure 100% that there's a causal relationship. With that in mind, I'm still not going to pick up smoking; I'm pretty sure it would give me cancer, 16:29:19 and I'm okay with never having a cigarette in my life. 16:29:23 So, to talk about why relationships are tricky, we talk about correlation, not causation. 16:29:28 And I just love this exercise; if you've seen it, forgive me. 16:29:30 You're going to have to go through this with me. 16:29:32 As you can see, this is the kind of chart you might find on things like social media or in random magazines, especially tertiary-type articles, and it shows here very clearly that as ice cream sales increase, so too do shark attacks, and as ice cream 16:29:50 sales go down, so too do shark attacks. Now, obviously, nobody's actually implying that ice cream and shark attacks are truly related. 16:29:59 I mean, they are related — they are associated — but we're not saying there's a causal effect. Number one, 16:30:05 I want to know what my r value is; like, how closely are they related? 16:30:08 This looks like a doodle graph, you know? If you're not seeing good numbers, I wouldn't pay too close attention to it. 16:30:13 I also don't have a p-value, so I have no idea what the likelihood is that this is random. 16:30:20 The other thing is, I have no references for this; 16:30:22 I have no idea where it came from. Now, I happen to know this came from UC 16:30:25 Denver, but this was part of an article; 16:30:29 where did they get their information from? And did anybody even bother to consider a confounding variable? As you're looking at this, whether you've seen it before or not, you automatically know this is silly. 16:30:42 Of course these things aren't truly in a causal relationship, but this is what we mean by a confounding variable: 16:30:48 both ice cream sales and shark attacks increase when the weather is hot. Like, nobody's going to the beach when it's snowing outside.
16:30:54 Well, some people do that, but God bless them! I don't know why they do that. 16:30:58 They're not buying ice cream, though. So the two aren't caused by each other; 16:31:00 they're actually both related to a confounding variable, which is hot weather. 16:31:05 So let's talk about another silly one. This is one of my favorites — one, because I absolutely love cheese. 16:31:09 I'm so sorry, vegans. I do not mean to offend you, but cheese is my life. 16:31:13 I love cheese. And what we can see is that there is actually a relationship between per capita cheese consumption and the number of people who die when they become entangled in their bedsheets. 16:31:24 I didn't even know that was a way to die, 16:31:25 but apparently it's a thing, and it happens with some relative frequency. 16:31:28 If you enjoy these kinds of charts, I recommend that you go to Spurious Correlations; 16:31:33 they have tons of these kinds of correlations for you to noodle over. 16:31:38 But here's the thing, right? 16:31:43 As silly as it sounds, we actually have references here, 16:31:49 and we actually have — no, sorry, not a p-value — 16:31:51 we have a correlation value. So here we can say that the data is from the U.S. 16:31:56 Department of Agriculture and the Centers for Disease Control, two very reputable sources that they were able to get this information from. 16:32:03 We can also see that the correlation is 0.94. 16:32:08 That is a really strong correlation. It is close to one, 16:32:11 so I know these things actually are really strongly related. I invite you guys to try and figure out what the confounding factor is here, because I could not find out what it was. 16:32:21 If you want to put it in the chat, we could have a lot of fun just trying to figure out what it could be. 16:32:26 But I also want to give you an example of something else that you might find, particularly in scientific literature, 16:32:32 in health-related news, on social media, anywhere. So this is another example where I have very reputable sources: 16:32:39 the Organic Trade Commission and the Department of Education's office for special needs. 16:32:45 They actually collected this information from very reputable sources. 16:32:48 I can see that my r is there, and that's about as close to one as you can get, meaning that as organic food sales increase, 16:32:59 so too do autism diagnoses in the United States. And my p-value is not 0, because there's no such thing as 0, 16:33:06 but it is close to 0, which means this is statistically significant. 16:33:11 This truly is an association. But I think we all have a lot of reservations about saying, "Well, clearly organic food causes autism," right? 16:33:20 Well, probably not. It could be bidirectional, right? 16:33:23 Like, maybe the more people hear about autism, the more likely they are to buy organic food, because they're concerned about pesticides and chemicals in the food. 16:33:31 It could very easily be that those who have a child with an autism diagnosis are more likely to buy organic. So these are the kinds of things; 16:33:40 this is why I caution people not to automatically associate causation with anything that they read, because there are so many confounding variables out there. 16:33:49 And yes, I use silly examples, because I believe we need to use those to really home in on these concepts.
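To make the confounding idea concrete, here is a minimal Python sketch. All the numbers are invented; it simulates two variables that are each driven by temperature, then shows they come out strongly correlated with a tiny p-value, even though neither one appears in the other's formula. This is my own illustration of the ice-cream-and-sharks pattern, not data from the slide.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Invented data: 100 days of weather. Hot days drive BOTH ice cream
    # sales and beach swimming (and therefore shark encounters).
    temperature = rng.uniform(10, 35, size=100)                   # degrees C
    ice_cream_sales = 20 * temperature + rng.normal(0, 50, 100)   # driven by heat
    shark_attacks   = 0.3 * temperature + rng.normal(0, 2, 100)   # driven by heat

    # Pearson r and p-value between the two "unrelated" variables:
    r, p = stats.pearsonr(ice_cream_sales, shark_attacks)
    print(f"r = {r:.2f}, p = {p:.2e}")  # strong r, tiny p

    # The association is real and statistically significant, yet neither
    # variable causes the other; the confounder (temperature) explains it.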
16:33:54 So it's important, because a lot of these researchers, again, are trying to answer a question; they're going to try and make a connection between variables, and that's how we decide how we want to change our medicine, change our diagnostics, 16:34:07 and why we should use what we use, based on this research. 16:34:11 And the problem is when we just assume that an association means causation. Number one, depending on the study, 16:34:22 again, random chance is never 0. Confounding variables: we don't actually think that the sharks are coming up on shore eating people because they had ice cream. 16:34:33 We talked about bidirectional or reverse correlation with regards to organic food sales and autism spectrum diagnoses. 16:34:41 And then again, the design of the experiment — and I say "experiment," that's the wrong word — 16:34:47 the design of the study can often affect how these mistakes, how these results, can happen. 16:34:54 So I will go over this briefly, because I did have that quick chart. 16:34:57 We have observational studies, and then we have experimental. Observational: 16:35:00 I don't know why, but I always have in my mind that I'm sitting in the middle of a field like I'm in The Sound of Music, counting butterflies. But essentially, there are researchers who are literally just watching the world, 16:35:10 whatever topic, whatever variables they have, and they're just writing things down. There is no manipulation of variables whatsoever. Which, I know what you're thinking: 16:35:20 why would anybody do that? How could we really understand the world? Well, there are plenty of instances where you just can't manipulate variables. 16:35:27 Maybe it's a situation where it's too dangerous, right? 16:35:31 Like, I'm not going to take a group of kids and feed them all organic food, 16:35:34 and take another group of kids and let them eat the standard American diet, and figure out whether there's an association with autism spectrum diagnoses; the results could be catastrophic whether you're right or wrong. 16:35:44 Same thing with cancer and smoking. There's never been an experiment on that, not in people; that's all based on observational studies. 16:35:52 And then sometimes it's just impossible to manipulate the variables, right? 16:35:55 If somebody lives in New York, you can't just say, "Well, can we pretend they live in California instead?" 16:36:00 There are certain things, certain demographics, that we simply can't change. And then experimental is where we can actually manipulate those variables. 16:36:08 We can control the confounding variables. We can control a lot of things, so that what we're looking at is good data that is aimed right 16:36:15 at the question that we posed. 16:36:22 The other question is: is this something where I'm taking data from the past, much like the spurious correlations and the examples that I showed you? 16:36:31 Those are all retrospective. Somebody looked at data, grabbed the data, put it into a spreadsheet, and that's how they found out. 16:36:38 So that's retrospective: it's looking at the past. 16:36:41 Whereas prospective obviously means that we're going to start a study 16:36:45 and we're going to follow it through. We're collecting data as the experiment is actually happening.
16:36:49 And so in those prospective studies, we can actually control and manipulate the variables, 16:36:53 whereas in retrospective ones, we're extremely limited in our ability to do that. 16:36:59 I wasn't going to talk much about this, but I'll go ahead and say there's qualitative and quantitative. Quantitative is what we typically think of with statistics, where we have all of our charts and numbers. 16:37:09 But qualitative is also really important. And it's kind of misleading that I have the title 16:37:14 "numbers versus feelings" on here, but you can never take the human element out of what we do in veterinary medicine. 16:37:21 We've all joked that we got into vet med because we don't like people. Well, people come with the patients; 16:37:26 we always experience people. And the lives and perceptions of the people taking care of those animals can absolutely influence outcomes. 16:37:34 There's a wonderful survey study that was put out a few years ago about seizure patients and client perceptions, and how those actually altered the long-term prognosis of those patients. 16:37:45 So things like focus groups, case studies, and surveys can absolutely provide good information. 16:37:50 Nothing causative, of course, but still good information. 16:37:55 So we'll go over some of these. And I apologize, if you are putting a presentation together for ACVIM for the case reports, 16:38:03 I recognize that I'm violating the rules in terms of wordy slides. 16:38:07 But if people wanted a copy of the slides, I want to make sure that you have the information on there as well. 16:38:13 I consider lab and animal models to be "a jumping-off point for the start of negotiations," which I know sounds kind of odd, 16:38:22 but I grew up with the movie Clueless, and so this is all that I think of. 16:38:25 They are extremely controlled. I mean, you have specifically bred mice and rats and other animals that live in very controlled environments, where there are no confounding variables, because you've controlled for all of them. 16:38:40 So the information that you get is really, really great. But you're probably thinking, "Yeah, 16:38:45 but how many times do we have this promising 16:38:47 study that says we have a new Alzheimer's treatment or a new treatment for cancer, 16:38:51 and then it just fizzles out and never goes anywhere?" And the reason is that these kinds of studies do not involve anything like our target population. Right? 16:38:59 So if we want to find out genetic factors in dachshunds that end up with ascending/descending myelomalacia, I don't particularly think it makes sense to be studying mice. These are great studies for proof of concept, though. So in my 16:39:13 mind, this is where a lot of research starts, in that very controlled, proof-of-concept way. 16:39:19 This is not something that you'd be able to take and say, "Oh, well, clearly I can just give this to my dog and it's going to be effective." 16:39:25 It's not technically generalizable. That doesn't mean it's not true; 16:39:29 it just means we can't take this information from a mouse model and generalize it to a population of dogs or cats. 16:39:35 The other reason is that the participants here may not have the natural disease state, and this is quite frequently why we don't see things translate over to human medicine when we're studying dogs, cats, mice, lab rats, 16:39:49 all of that. So next we have randomized
16:39:52 controlled trials, which also typically lead into our clinical trials. These are prospective, which means that they've designed a study, 16:40:03 they execute the study, and they monitor the results as they're coming in. 16:40:07 It is an experiment where the variables, and in theory the confounding variables, are controlled. 16:40:12 They almost always have some degree of quantitative data, so you can crunch the numbers and get really good p-values and statistics. 16:40:19 They have very good internal validity, meaning that the results are going to be very true to the study; 16:40:24 it's just whether or not you're going to see a lot of external validity. Again, 16:40:28 these are highly controlled, and usually your target population does not live or exist in highly controlled situations. 16:40:35 So there is some range for generalizability, but this is still what we look at when we're trying to determine the efficacy of a medication or a treatment. 16:40:44 And this is the gold standard when we try to determine whether or not we want to change the way that we do medicine, or whether we need to introduce new diagnostics or treatments. 16:40:55 So this is how FDA meds get approved, whether it's for veterinary or human use. 16:40:58 The discovery or preclinical phase is probably going to be a mouse model, 16:41:03 a proof of concept. Phase 1, then, is going to be: 16:41:05 is this safe? I'm going to give this to a bunch of healthy people; hopefully it doesn't kill them. 16:41:10 We shall see. Phase 2, you're going to give it to a bunch of patients that have the disease; 16:41:16 you want to see if it's effective. And then phase 3, you're going to see: 16:41:19 is it more effective than what we currently have? And then it gets FDA approval. 16:41:24 And the one thing that I'll caution — and I don't have my soapbox segue here — 16:41:27 is that there is a fourth phase. After it's FDA approved, and you have access to it through a prescription, they're still collecting data, and they're still trying to determine how effective it is and how safe it is. 16:41:39 So that's why you sometimes see drugs get pulled from the market: it's because they were technically in that phase 16:41:45 4 of clinical trials. 16:41:48 Crossover designs are really neat, because — let's say in this case we have a bunch of beagles — 16:41:55 you have beagles in one group, and you have beagles in another group. 16:41:59 One gets a treatment, the other doesn't get a treatment, 16:42:03 and you see how your results play out. 16:42:06 Then you have a washout period, and the groups switch. It's really good for trying to tease out things like individual factors. 16:42:13 So, even though I might look similar to, say, my brother, or a sister if I had one — we would obviously be a similar race, 16:42:20 we would be the same sex, the same gender — 16:42:24 that doesn't mean that we're the same. 16:42:26 So you can actually elucidate more individual factors by using these kinds of crossover studies. 16:42:32 Case reports: this sounds familiar to those of you who have your VTS case reports coming up. 16:42:39 Now, every academy is different. However, if you are coming into the AIMVT, I will say that these, in the literature, are going to be a little bit different from what you're used to, or what you've been guided to do, for the VTS application. In the literature, it's usually very novel disease states, 16:42:56 novel diagnoses, novel diagnostics, medications.
16:43:03 They're basically trying to say, "This happened with this patient, 16:43:07 and look, isn't this neat?" Again, it's observational. 16:43:09 You are parroting back what has happened in this narrative; 16:43:12 you are not actually manipulating anything. So in this example, this was a cat where the clinic actually established that non-convulsive status epilepticus does exist in the feline population — and, as they technically found out later, also the canine population. 16:43:27 So they weren't trying to say this is causative. 16:43:30 They were trying to say, "Hey, docs, this is a thing; we should really look into this." 16:43:34 Your case report for your VTS application is going to be similar in that you are going to be reviewing what happened with that case in a very methodical and scientific way. 16:43:44 I just wouldn't recommend doing something very complicated or very difficult that may kind of defeat the whole spirit of what that case report is; I just wanted to make sure that's clear. 16:43:56 And again, this is really good for trying to find out risk factors, or diagnostic approaches, as in this article. 16:44:03 Maybe there are specific genetic factors, like we were talking about here, or specific medications. 16:44:09 So the idea is that you're not going to imply causation, but it's one of those where you say, 16:44:11 "Oh, well, that's very interesting. I should keep my eye on this." And again, it's kind of like a springboard into your next study, which could be a case series. 16:44:22 So again, in this situation, that particular clinic went from "hey, 16:44:26 we found a cat with non-convulsive status epilepticus" to "we'd like to see how often we're, quote-unquote, seeing it but missing it." 16:44:33 And so they actually published a case series as a follow-up. 16:44:38 And in this case, now it's kind of that bridge between 16:44:39 "I've got a single case" and an actual sample size. 16:44:43 So again, it's observational. It can't necessarily prove causation; 16:44:47 there are probably a lot of other confounding factors. But it does also allow you, as the technician — if you're using a specific diagnostic or a treatment in your case report that maybe isn't standard, 16:44:59 maybe your specialist says, "Hey, I like this protocol for XYZ reasons" — 16:45:04 to say that studies have shown, or a recent case series showed, that electroencephalography has been able to identify non-convulsive status epilepticus and inform prognosis. 16:45:13 So this is a great type of article if you're trying to say, "This is why we did what we did," even if you don't have an experimental trial to back that up. Cohorts: again, 16:45:28 still observational. We're just going to track subjects over time. 16:45:30 These are very common, and they can be very expensive in the human world, especially if you're tracking people over 50 to 70 years at a time. 16:45:39 But again, you're gathering events as they unfold, and you're trying to answer 16:45:44 the question of which came first, the chicken or the egg, essentially. 16:45:48 So this is good for common disease states, as this article here shows: what kind of risk factor is the age at neutering 16:45:55 for becoming overweight or for orthopedic injuries, based on 16:46:00 when you were actually neutered? Golden retrievers are common, 16:46:05 those injuries and illnesses are common, and obviously neutering is common.
16:46:08 So this is a great way, if you have a common disease, to try and figure out what the risk factors are for that disease. And cross-sectional: I almost consider it like, if an animal's life is a loaf of bread, you're taking a slice out 16:46:25 of it. You're just taking one little slice out, and you're looking at it, and you're saying, "Okay, this is what's going on here." 16:46:30 So these are really good studies for better external validity, 16:46:33 because, again, you're not necessarily trying to control for anything — which, again, means we lose the causation side. 16:46:40 But now, at least, I have a better understanding of what it looks like in my general population. 16:46:44 And again, these are really good for describing a disease, a treatment, 16:46:47 and any risk factors associated with it. 16:46:52 So again, primary research is really fun and cool, but like I said, you can go down rabbit holes. 16:46:57 These are very limited in terms of what information they're going to give you, but they can give you very specific — I'll say argument points; 16:47:05 not that I expect that you would argue it, but very specific points in your case 16:47:08 reports or your presentations or your writing. Secondary research, 16:47:13 I think, is significantly easier to read. I enjoy these more, and if there's secondary research on a topic, that means there's been a lot of research on it. 16:47:23 So a narrative literature review is probably the most common one that you'll come across, and it's really easy to point out, because typically it'll say "review" in the title. 16:47:34 So you know it's a review, and it's going to be very in-depth on a very specific topic. 16:47:38 This one here, you can see, is on the gastrointestinal microbiome. It's a lot easier to read; 16:47:44 there's a lot less of the medicalese in your reviews. 16:47:48 Now, your narrative reviews, again, are really good for summary updates in the field. 16:47:53 So if you're looking for an update on a disease, or seizures, or something like that, a review is a really good place to start, because it's easier to read, it's going to have a lot of information, 16:48:03 and there's a lot of research that goes behind it. The downside — 16:48:07 and this is where me, I like control — is that this is not structured in any way. And although that doesn't sound like a bad thing, without that IMRaD structure, these authors have no obligation to be transparent about what studies they've included in their review and what they haven't. So circle 16:48:26 back to the hermeneutics, where I said, try and get into the author's brain. 16:48:29 Everybody has a little bit of bias and opinion. I'm not saying that anybody intentionally gives bad information in a narrative review, but it's just something to think about: 16:48:39 when they don't have to be transparent about what information they include, always be not suspicious, exactly, but just a little bit cautious. 16:48:49 Systematic reviews: this is my baby. This is why there's a soapbox on this slide. 16:48:54 If you can find a systematic review on your topic, this is some of the best information you can find, because it is a review, which means it's going to be easier to read, 16:49:05 but it's structured. The authors are going to be very transparent about which studies 16:49:07 they included, which ones they excluded, and why, and then they take all of that information together and give you one nice, hopefully
16:49:18 succinct, article to read through. So again, it's great for a large amount of information. 16:49:23 We have a reduction in reviewer bias, because they have to be transparent about what information they're drawing from, and a reduction in the concern for quality, meaning that typically in their materials and methods they will say, "This was a bad-quality study for XYZ reasons, so it 16:49:39 was excluded." So you know that they're going to be choosing good articles. With that in mind, with any author there is a person behind it; 16:49:47 there can always be a bias. But this is going to be some of the best that you can get, and the same goes for meta-analyses. 16:49:54 The only difference between a systematic review and a meta-analysis is that a meta-analysis is like an analysis of analyses: 16:50:02 you're able to take the sample size that existed in one study and bring it together with all the other studies, 16:50:08 and now your sample size is enormous. And the bigger your sample size, the more accurate your study is going to be. 16:50:14 So these types of studies are extremely accurate. Again, nothing is a hundred percent, 16:50:21 but these are going to be where you find the most accurate information. So if you want a secondary review — so that it's easier to read, you get a lot of information, and it's the most accurate — look for a systematic review, or a meta-analysis, or both, like this 16:50:33 article here. 16:50:36 So I've gone over a lot. Again, there's tons and tons more out there, but as for when to use what: your primary research is going to be good if you are writing in your case report that "we did XYZ because, according to a recent study..." — you're probably just citing one study, and 16:50:53 that is this study. If you're saying "several studies have shown 16:50:57 XYZ," you're making an argument with a specific point, 16:51:00 but now you're saying that it's not just one study; 16:51:03 there are multiple, there's a lot of information out there, and that's when you're going to want to use the secondary. And really, just leave the tertiary for the basic background information, 16:51:11 because at that point you're really just talking about basic facts. 16:51:14 So, you know, with your primary it might be, "Oh, 16:51:18 a single study showed that there was this factor in the genes of dachshunds which made them more likely to extrude a disc." Your tertiary is going to be talking about what Hansen type I is on a basic anatomical scale. 16:51:30 I don't know if you can screenshot, but if you can, screenshot this; if you can't, just look up the hierarchy of medical evidence and then click Images in your Google bar. 16:51:43 This shows you the level of quality of evidence that we use to make evidence-based decisions. Animal and lab studies are at the bottom. 16:51:52 Now, it says here "not involving humans," but if you think of it from the veterinary perspective, if it doesn't involve your target population — like dogs and cats, or exotics, whatever it is — then it's still in that same boat. 16:52:06 The other thing that we see here: your case reports and case series. 16:52:09 There's no design, so how can there really be good quality of evidence? 16:52:12 And we're going to highlight this bad boy right here: 16:52:17 expert opinions. That's right, expert opinions are down at the bottom of the pyramid, 16:52:21 ladies and gentlemen. Case-control studies and cohort studies are 16:52:25 up here.
16:50:36 So I've gone over a lot. Again, there's tons and tons more out there, but when to use what: your primary research is going to be good if you are writing in your case report that, you know, we did XYZ because, according to a recent study... You're probably just citing one study, and 16:50:53 that is this study. If you're saying several studies have shown 16:50:57 XYZ, you're making an argument with a specific point, 16:51:00 but now you're saying that it's not just one study; 16:51:03 there are multiple, there's a lot of information out there, and that's when you're going to want to use the secondary literature. And really just leave the tertiary sources for the basic background information, 16:51:11 because at that point you're really just talking about basic facts. 16:51:14 So, you know, with your primary, it might be, oh, 16:51:18 a single study showed that there was this factor in the genes of dachshunds which made them more likely to extrude discs. Your tertiary source is going to be talking about what Hansen type I is on a basic anatomical scale. 16:51:30 I don't know if you can screenshot, but if you can, screenshot this; if you can't, just look up the hierarchy of medical evidence and then click Images in your Google bar. 16:51:43 This shows you the level of quality of evidence that we use to make evidence-based decisions. Animal and lab studies are at the bottom. 16:51:52 Now, it says here "not involving humans," but if you think of it from the veterinary perspective, if it doesn't involve your target population, like dogs and cats, or exotics, whatever it is, then it's still in that same boat. 16:52:06 The other thing that we see here: your case reports and case series. 16:52:09 There's no design, so how can there really be good quality of evidence? 16:52:12 And we're going to highlight this bad boy right here: 16:52:17 expert opinions. So that's true, expert opinions are down at the bottom of the pyramid, 16:52:21 ladies and gentlemen. Case-control studies and cohort studies are 16:52:25 up here. We start getting really good with our experimental studies. 16:52:28 And look at this: here are my meta-analyses and systematic reviews. 16:52:32 These are really high evidence. And then you usually will have agencies like the CDC, the National Cancer Society, 16:52:41 things like that. 16:52:46 They can then put it all together as practical guidelines. 16:52:49 But for your purposes, again, if you're trying to form an opinion or sway an opinion, those are going to be the best. 16:52:53 If you're just going to give some general information, coming down here is perfectly fine. 16:52:58 So why am I harping on expert opinion? It's because, well, everybody has an opinion, like they say. Our opinions are going to be different from person to person, from topic to topic, and they're going to change based on your formal educational background. You can even see this between veterinarians that went 16:53:17 to two different universities. What are their experiences in the past? 16:53:22 I've never had a problem when I've had a patient on amantadine, but there's another veterinarian I met that absolutely hates it. 16:53:29 They think it's a dangerous drug. Cultural background and values shouldn't necessarily affect medicine, but they can absolutely affect opinions and perceptions and biases. 16:53:39 And then our perceptions: where are we? Where are we going? 16:53:43 Which, again, is going to be influenced by our past and our culture. 16:53:46 Expert opinion pieces are written by authors with no exemptions from any of the above. 16:53:51 So when I say narrative reviews have no structure, that is essentially expert opinion. 16:53:56 I'm not saying they're not experts. But, more importantly, as you see with my little cartoon veterinarian up there: our veterinarians are experts. 16:54:04 You listen to them. We function under their license, so you don't argue with them. 16:54:09 Don't go back saying, oh, Christine said that veterinarians' expert opinions are crap! That is not what I'm saying at all. 16:54:15 What I'm saying is that they're very susceptible to bias. 16:54:19 So when you're writing your case reports for your VTS application, your neurologist, your internist, your oncologist, or your cardiologist: 16:54:30 they are not references that we want to see in the report. 16:54:33 I want you guys to independently verify facts and not just take things at face value. 16:54:40 You should be researching all of the aspects that go into your case reports, to make sure that you understand them at a deeper level, and not just based on what your veterinarian is telling you. 16:54:48 I mean no insult to any veterinarians. You are experts. 16:54:52 We can't do anything without you guys. But just understand that veterinarians do fall into that expert opinion category, which is at that bottom level of the pyramid: justifiable 16:55:02 opinion or belief. Good resources: peer-reviewed journals. Again, JAVMA is a phenomenal one that you can go to. PubMed 16:55:16 is good. Google Scholar, if you must. Don't just go to Google. 16:55:23 And then the Cochrane Library. The Cochrane Library is really primarily human health, but it's nothing but reviews, 16:55:29 and it's amazing. It's a great place to find secondary literature. 16:55:33 And of course, your reputable textbooks.
16:55:37 Here's my soapbox again, ladies and gentlemen. I swear, the Google machine is not a resource. I can't tell you how many people, even outside the veterinary industry, try to cite it, but that's not a thing. Don't put it in your references on a case 16:55:54 report. Anybody trying to sell you something: throw it out. I don't care how good it looks. 16:55:59 And again, I apologize to PETA lovers out there: 16:56:01 PETA is one of the worst. They're always trying to sell you something, and nothing 16:56:03 they actually say is of any actual evidentiary value. And then, of course, social media. Do not get your information off of social media; that just comes back to everybody's opinion. 16:56:15 You know what they say about opinions: everybody has one. 16:56:18 Briefly, the CRAAP test. This was developed at California State University. 16:56:24 How do we evaluate the information that is in front of us? 16:56:27 We are going to look at currency, relevance, authority, accuracy, and purpose. How timely is this article? 16:56:34 Remember, for your case reports it needs to be 10 years or newer; again, the half-life of knowledge. 16:56:39 Is it relevant to your needs? If it doesn't have anything to do with your case report, feel free to read it and have fun with it, 16:56:48 but for your case report it probably isn't going to help you. 16:56:50 Who wrote it? Who did the research? What is the source? 16:56:54 How accurate is it? That may be a little bit difficult; again, we're not experts on evaluating everybody's materials and methods. 16:57:02 And then, why was this done? Why does this information even exist? 16:57:05 What was the impetus for this author and these researchers to do this? 16:57:09 So again: when was it published? How recent is it? Is it even relevant to your topic? 16:57:14 If I have to reference something from the 1970s, that's not a good day; you should find something more recent than that. 16:57:21 And also, if it's a website, when was the last time it was updated? 16:57:27 What is the relevance? Is it relevant to your topic? 16:57:31 I don't have an image for this, because I kind of feel like there's just so much variation. 16:57:34 If you're talking about this disease, find something on this disease. 16:57:38 If you're talking about a cancer, make sure it's the right cancer. 16:57:40 Would you be comfortable using the source on a regular research project? 16:57:46 Who is the researcher on this? These articles should be very open about where these people work. If these people don't look like they have any authority to be talking about EEG, or neurology, or animals, then maybe take a step back; maybe that's not the right 16:58:04 research for you. What are their credentials and their affiliations? 16:58:07 And again, if it's a website, does it reveal anything about them? 16:58:09 Again, does this come back to somebody trying to sell you something? 16:58:16 And again, make sure that there's at least a good reference section. 16:58:19 You can see that this one is up to 77 references in this one article. Make sure that they have referenced every single claim that they are trying to make. If they are unable to do that, or they cannot provide references, I would not suggest using it. And again, if it's a website, make sure 16:58:33 that everything on there actually works: all the links, any email addresses, contact info, and things like that. 16:58:41 And they should also be very open as to why they even bothered to do this. 16:58:46 So again, do they make their intentions clear? This one's a great one. 16:58:50 The purpose of the study was to characterize the associations between gonadectomy, neutering, and two outcomes. So they make their intentions clear: it's not, hey, 16:59:04 we want to make sure you buy something extra if you neutered your golden retriever at too young of an age.
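As a purely illustrative sketch of how you might run a source through those five criteria, here is a hypothetical Python checklist. The `craap_check` helper, its field names, and the example source are all invented for demonstration; only the 10-year currency cutoff mirrors the case-report rule above:

```python
from datetime import date

# Hypothetical helper that walks a source through the CRAAP test:
# Currency, Relevance, Authority, Accuracy, Purpose.
def craap_check(source: dict, topic: str) -> dict:
    return {
        "currency":  date.today().year - source["year"] <= 10,  # 10-year case-report rule
        "relevance": topic.lower() in source["title"].lower(),
        "authority": bool(source["author_affiliation"]),
        "accuracy":  source["peer_reviewed"] and source["reference_count"] > 0,
        "purpose":   not source["selling_something"],
    }

# Invented example source, loosely echoing the review discussed earlier.
source = {
    "title": "Gastrointestinal microbiome in dogs: a review",
    "year": 2020,
    "author_affiliation": "University veterinary school",
    "peer_reviewed": True,
    "reference_count": 77,
    "selling_something": False,
}

for criterion, passed in craap_check(source, "microbiome").items():
    print(f"{criterion:9s}: {'pass' if passed else 'flag for a closer look'}")
```

A real evaluation is a judgment call, not a boolean, but writing it out this way makes it harder to skip a criterion.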
16:59:11 So, even with all that, even if you're comfortable with what resources you have, just keep in mind that it's always good to not only keep an open mind, but also stay just a little bit skeptical. 16:59:20 Right? Like I said, biases are always possible, and they're not always intentional. 16:59:26 Pharmaceutical studies, for example: their whole job is to sell you a drug. 16:59:31 So these randomized controlled studies are really good, but keep in mind that, for FDA approval, all they really need is 2 studies that show that it's effective. 16:59:40 They could run 10,000 studies, but as long as 2 show effectiveness, that's enough, so their push is going to be a little bit different. 16:59:46 How you select your sample matters too. 16:59:51 You know, is this on a volunteer basis? Are pet owners that have the luxury of being able to take off work to bring their dog in 17:00:00 the only ones that you're going to accidentally sample? 17:00:02 Because those of low socioeconomic status may not be able to bring their dogs in regularly. 17:00:07 Is there a publication bias? Hey, I didn't get the results I wanted, 17:00:12 so I decided not to publish. As well as reviewer 17:00:15 bias. And random chance: like I said, your p-value is never going to be 0. 17:00:20 It's never 0. So, like we said here, you know, your chances of getting killed by a group of ducks are very, very low. Can't imagine a better or worse way to go. 17:00:31 And again, confounders, incidental covariation: 17:00:35 all those things are really hard to eliminate, especially from a single study. That's why I like my systematic reviews: it's not a single study 17:00:41 I'm reading, it's several, and these things become more apparent when you're able to compound all of this information.
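To see why random chance and publication bias matter, here is a small Python simulation; everything in it is made up, with no real drug or dataset behind it. It runs thousands of "studies" of a treatment that does nothing and counts how many look "significant" at the 5% level anyway:

```python
import random

# Toy simulation of why a p-value is never truly zero and why publication
# bias matters: run many "studies" of a treatment that does NOTHING, and
# count how many still come out "significant" by chance alone.
random.seed(1)

def fake_study(n=30):
    """Compare two groups drawn from the SAME distribution (no real effect)."""
    control = [random.gauss(0, 1) for _ in range(n)]
    treated = [random.gauss(0, 1) for _ in range(n)]
    diff = sum(treated) / n - sum(control) / n
    # Crude two-sided check: |difference| beyond 1.96 standard errors,
    # the usual z cutoff corresponding to p < 0.05.
    se = (2 / n) ** 0.5
    return abs(diff) > 1.96 * se

runs = 10_000
false_positives = sum(fake_study() for _ in range(runs))
print(f"{false_positives} of {runs} no-effect studies looked 'significant'")
```

Roughly 5 percent come out "significant" by luck alone; if only those get published, a useless treatment starts to look real, which is exactly the pattern a systematic review can expose.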
17:00:48 So with that, I wanted to just provide a real-world example. I did do this last year. 17:00:53 I am not selling or trying to say anything about ivermectin 17:00:56 and COVID-19. I am simply using this as an example. 17:01:01 These are 2 systematic reviews. They are both from very reputable sources, 17:01:06 here and here, and they are both very timely. So if we do, for example, our CRAAP test: the currency, 17:01:14 this one's from 2021, this one is from 2020. 17:01:17 They are both relevant; they're both talking about efficacy of ivermectin in 17:01:22 COVID-19 treatment. Their authority: they were both from facilities that do medical research. 17:01:28 Their accuracy I can't account for, because I'm not an expert. And their purpose is obvious: 17:01:34 do we use ivermectin or not? 17:01:35 Both of these studies, despite being reputable and kind of passing the CRAAP test, came up with 2 completely different conclusions. 17:01:43 So that's why, sometimes, even though we want to jump on the most exciting new thing, 17:01:48 sometimes we have to wait a few years. That half-life of knowledge kind of has to catch up as we continue to study, whether it's ivermectin in COVID-19 or specific treatments and diagnostics for our dogs and cats. 17:02:00 Sometimes it takes a few years, because we have to test and challenge and keep learning about these subjects, so we can eliminate those confounding variables. 17:02:07 We can try and eliminate the biases. And so, the key takeaways: 17:02:11 science is a process. Some say it's self-correcting, but I actually think it takes active participation; it's not going to correct itself. 17:02:18 We have to get in there and read it and learn about it and challenge it. 17:02:22 Causal relationships are difficult, so don't just take anything at face value, especially if someone says X causes Y. Different types of literature are going to suit different needs. 17:02:32 Secondary literature is my favorite, especially systematic reviews, but sometimes I want to go to the primary source. 17:02:41 The main goal is to answer a question and to be able to generalize that answer to the population. 17:02:46 Was the article able to establish that? And expert opinions, like our veterinarians': we are guided by them, but they are not the end-all-be-all 17:02:54 of the justification of why we do what we do. Do 17:02:58 the research, and justify why you did what you did in your case 17:03:01 reports. And again, remain open-minded, but also skeptical. And with that... oh, I'm 2 minutes over, maybe even more. 17:03:10 I apologize. But if anybody has any questions, I am open for questions. 17:03:22 And I can also stop sharing, if that's what you guys would like. 17:03:25 I love that picture of you lecturing. 17:03:28 It's super fun. I actually just found this picture on Friday. 17:03:33 It was from a talk I gave in San Jose. I have no idea what I was talking about 17:03:38 or pointing at, but I seem very pensive, right? 17:03:40 Yeah. There were no questions in the chat. It was such a fantastic presentation. 17:03:50 Thank you so much. I am so glad you came to give this tonight. 17:03:55 Absolutely, absolutely. And if anybody wants a copy of the slides, or if you just want kind of a bulleted rundown of what we talked about, or questions come up later, please feel free to reach out. I'm a very reachable person. 17:04:10 I love talking about things I'm passionate about. 17:04:14 I just spent a weekend mountain biking, and somebody said "statistics," and I went on a 1-hour rant. 17:04:19 So I mean, I'm happy to help and get as involved as you guys are interested in. 17:04:25 Perfect. 17:04:28 Thank you so much. I'm going to, let's see, cam.