Communicating Results

If a scientist does an experiment but doesn't share the results, did they actually do the experiment? Philosophical questions aside, the answer is basically no. Sharing is caring in the science world. It's how we keep improving our understanding of how stuff works, and it's how we avoid having to reinvent the wheel every single time we want to learn something.

Can you imagine if scientists had to rediscover gravity every time they wanted to study velocity? Or figure out the double helix model of DNA every time they wanted to sequence a gene? Luckily we don't have to, because the people who first discovered that stuff shared their findings with the rest of us.

How do scientists keep each other in the loop on their latest discoveries? Hint: they don't use Snapchat. Instead they go through a peer review process. This is sort of like in English class when we trade essays with another student and scour each other's work for spelling or grammar mistakes and give feedback on the awesomeness of their Macbeth analysis.

The science world is a little different than English class (there's less gum under the tables, for one), so here's the general idea of how scientists put together a scientific paper.

  • Background on the Topic: This is where we talk about what we already know and give an overview of the questions our experiment is trying to answer. We know, it kind of sounds like a snooze-fest, but it gives the reader some background knowledge in case it's not their area of expertise. Being a scientist doesn't mean you know everything there is to know about elephant seals, elastic polymers, and the effects of greenhouse gases on field mice. It also helps give the reader some context about the questions we were trying to answer and what we were trying to accomplish with our experiment.
Let's say, for example, we did an experiment where we placed different fruits into bags with an unripe avocado, to determine which fruit will ripen the avocado faster (we had a serious guacamole craving). Our background on the topic would include information we already know about what causes avocados to ripen, the appropriate chemistry behind ripening fruit, if the ripening process is affected by heat or cold, etc. We would also want to include our scientific question and hypothesis, as well as explain our motivation for performing the experiment (guacamole, duh).

  • Methods: This section is the how-to for our experiment. Since other scientists may want to give our experiment a whirl, we need to be as detailed as possible, from what size beakers we used to how many times we swished the solution around before we put it on the hot plate. Why do we need to include so much detail? Imagine trying to teach someone how to tie their shoes in a letter. "Loop the one side around the other and pull the bunny ears," isn't gonna cut it. Scientists who are peer reviewing our experiment will also want to know exactly how we got our data, so a detailed description of our methods is crucial to backing up our findings.
If we head back to our avocados, this is where we describe how many avocados we used, how we selected them, their size and mass, which fruits we used, how we selected those, what kind of bag we placed the fruits and avocados in, where we let them hang out, how long we let them hang out, and so on. We want to give our readers all of the itty-bitty details so they can understand what we did (and then do it for themselves if they want to).

  • Results: Drum roll please! This is the moment we've been waiting for, ever since we said the "Hmmm" that got this whole process started—the results. Here we want to talk about the data our experiment spit out. Of course, not every scientific study is all about the numbers, so if all you've got are qualitative data, feel free to share those here too. Keep in mind this section is just for sharing what happened during our experiment. We aren't getting into the details of what all those numbers and observations mean, yet. Wait for it…
In our avocado experiment, we'll want to talk about how we determined ripeness. Then we can show our data for how long it took for an avocado to ripen by itself and how long it took an avocado to ripen when it was paired up with a fruit. We might need a graph (there's a rough sketch of what that could look like right after this list); we definitely need chips.

  • Discussion: Okay, now we can talk about what we think our data are trying to tell us. We'll want to tie our findings into the stuff we discussed in the background information section. Did we answer the questions we asked? What new questions did our experiment bring up? How do our findings impact our overall understanding of the field we're studying, and science in general? We want to understand how our results affect the big picture here. We'll also want to give some suggestions for improving on our experiment (nobody's perfect), as well as ideas for future studies.
Let's wrap up this avocado experiment so we can get to the fun part (eating them). We'll need to look at our data and see if one of the fruits we used actually made the avocado ripen faster. Then we'll want to talk about any additional questions we may have. For example, does sunlight affect ripening? Do avocados ripen faster in a paper bag or a plastic bag? Where is the salsa?

We also need to talk about how our findings affect the scientific community. Maybe our findings can be used by grocery stores and restaurants that need ripe avocados, stat. Lastly, we'll want to talk about how we can improve our experiment. Maybe we can come up with a standardized way to measure ripeness, since "poking the avocado" isn't very scientific. Or maybe we find a way to put the avocados in a room with controlled temperature and light. These suggestions will help the next scientist who does the experiment make it just a little more accurate.
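
Speaking of that graph from the Results section, here's a rough sketch of what turning our avocado data into a chart might look like. To be clear, this is just for illustration: the numbers are completely made up (hypothetical, not real results), and we're assuming Python with the matplotlib plotting library, which is a common choice but hardly the only way to do it.

    # Hypothetical example: graphing results from the avocado experiment.
    # The ripening times below are made up for illustration only.
    import matplotlib.pyplot as plt

    # Days it took the avocado to ripen next to each fruit (made-up numbers)
    conditions = ["Alone", "Banana", "Apple", "Orange"]
    days_to_ripen = [7, 3, 4, 6]

    plt.bar(conditions, days_to_ripen)
    plt.xlabel("Fruit paired with the avocado")
    plt.ylabel("Days until ripe")
    plt.title("How fast did the avocado ripen?")
    plt.tight_layout()
    plt.show()  # or plt.savefig("avocado_results.png") to save the image

The actual paper wouldn't include the code, of course; it would just show the finished, clearly labeled graph so readers can check the numbers for themselves.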

Putting Our Science In the Spotlight

Next step: submit our paper to be published. Scientists don't throw their results on their Tumblr and call it a day. While there's nothing wrong with Tumblr for sharing that deep quote about life being like a washing machine, it's not really where we go for legit science information. We'd prefer that our scientific studies be reviewed by, well, scientists. Crazy, we know.

Luckily there are a whole bunch of scientific journals that publish studies done by scientists. Before that study can make it to the printing press, though, it needs to be reviewed by several peers, or scientists who work in the same field. They'll read it, reread it, and read it again to make sure there are no errors, bias, or unethical tomfoolery, and that it is detailed and clearly written.

This process is like a gatekeeper, weeding out the not-so-good studies and only letting the really exceptionally good stuff through for the world to see. In fact, some journals are so persnickety, they publish less than ten percent of the studies that get submitted to them. Thankfully, our English teacher wasn't that tough.

So, what does a report need to have to pass the test? First of all, the experiment should be well designed, logical, and attempt to advance our knowledge of the topic. Sorry, throwing apples at the ground to prove gravity exists isn't really helping anyone at this point in time.

Any claims that are made should be supported with evidence from the experiment. This evidence has to be the good stuff. It can't be made up, it can't come from faulty equipment, and it can't be someone else's. Keep in mind that science is about building our knowledge, which means we might need to use someone else's research as our foundation. That's cool, as long as we make sure to give the other scientist props for their contributions and don't claim them for our own.

Lastly, it needs to be well written. Yes, even if we've got our hearts set on working in a research laboratory, we still need to pay attention in English class (and no, your English teacher did not pay us to say that). Reports should be clear and concise, while still including the necessary details. We obviously want to avoid spelling and grammar errors, as well. These make our reader wonder: if we don't know the difference between "there" and "their," what else are we getting wrong? Even if we have a really awesome, perfectly performed experiment, our report can still be rejected for publication because it wasn't written well.

So what if it doesn't pass the test? That happens. Actually, it happens quite a bit, because those science journals are super picky. And it's not the end of the world. If a scientist gets a less-than-stellar review, they can use the feedback they received to go back and improve their experiment or their write-up of it, just like we can go back and revise our English paper after a peer review before turning it in. Then they resubmit it and the whole process starts again.

The End Result

Even if a study does pass the test and is published, scientists still aren't lining up to have it tattooed on their arm as truth. Instead, those ever-skeptical scientists will make their own attempts at performing the experiment to see if they can get the same outcome. Depending on the experiment, this can take a really long time. However, repeating an experiment and getting the same results is just another way we can make sure the original results are trustworthy. And if we don't get the same results, well, we know that something isn't right and we're glad we didn't get that tattoo.

Peer review can be pretty time-consuming, but it's important to the scientific community because it makes scientific studies more trustworthy. A scientist would much rather trust the data from a study that earned a gold star from other scientists than something on some guy's blog he writes from his parents' basement. Just like we'd rather eat at the restaurant with a hundred five-star reviews on Yelp than the restaurant with only one review that looks like it came from the chef's mom.

Practice, Practice, Practice

While our high school heart rate lab isn't likely to get published in the New England Journal of Medicine, that write-up isn't totally worthless. It gives us valuable experience in communicating scientific findings just like real scientists. So really, we were just doing what Stephen Hawking does, only on a slightly less sophisticated level. And we thought our teacher was just trying to torture us…

We've given you the rundown on how scientists usually communicate the results of their experiment. Does it always happen that way, though? Not necessarily. How scientists decide to communicate their findings depends on their experiment and whom they want to inform. If they've discovered a new cure for a major disease, a scientific journal is the way to go.

Scientists can also share new discoveries while giving a PowerPoint presentation at a conference, or by writing a book on their findings. If they discovered that a hummus facemask made their pores 6% smaller, well, that might be best shared in a quick email. We probably don't need to share that one over lunch, though.

Common Mistakes

A major mistake scientists make in communicating is not being clear in how they conducted their experiment. Remember that other scientists may try and repeat this experiment, so if our procedures are confusing, they may as well be written in Sanskrit. Don't believe us? Write out the instructions for making a peanut butter and jelly sandwich and have someone who's never made one follow them. If we don't write out each step clearly and in detail, we may need to order in lunch.

When we're writing our procedures, we should be as clear as possible, include pictures or diagrams when we can, and have someone else read it to make sure it makes sense to someone on the outside of our heads. If your lab procedures are less like a glass of water and more like a cement smoothie, head on over to the Shmoop Essay Lab and we'll help you clear things up right quick.

Some More Common Mistakes

We're constantly getting smacked in the face with scientific studies making all sorts of claims: that shampoo causes cancer, or that cramming for a test is more effective if you've had a Red Bull. The mistake is blindly believing the news reports on these studies without knowing all the facts.

Unfortunately, we rarely hear the important details after somebody says the phrase "studies say." Was that shampoo study performed on mice or humans? Was the Red Bull study funded by Red Bull? Did they use appropriate science techniques and sample sizes? Did they analyze their data correctly and without bias? Who knows? By the time we hear about it on the news, it's already been generalized into an eye-catching headline that we can impress our friends with at a party, while the science stuff is buried in a drawer somewhere.

In the science world, scientists would never, ever, ever accept the results of a single study as fact. First they'll scrutinize how the study was performed and the data that were analyzed; then they'll wait for confirmation from other studies. Even then, they'll always be giving it the side-eye. Science is a long, slow process that requires a lot of double, triple, and quadruple checking before new conclusions are accepted as valid. So, don't believe everything that comes after "studies say," unless it's, "Studies say students who use Shmoop are awesome." That's totally legit.

Just kidding. That was a trap; we made that up to keep you on your toes. We do think you're awesome though. Stay skeptical, friends.

Brain Snack

In 1989, Martin Fleischmann and Stanley Pons claimed they had discovered a new energy source through the process of cold fusion. In case fusing atoms isn't one of your extracurricular activities, cold fusion is when two atomic nuclei squish together to make a heavier nucleus at room temperature. Regular old fusion usually requires center-of-the-sun temperatures.

Of course the scientific community was pretty pumped about this. Lots of cheap energy at room temperature? Sign us up. But as Fleischmann and Pons' experiment went through the peer review process, the fist pumping died down. Their report lacked many important details and other scientists were unable to replicate their results. As the data were examined more closely, scientists noticed there were many errors in the procedure and data that were collected.

In short, Fleischmann and Pons' experiment was a hot mess with no cold fusion. However, without the peer review process, we'd still be trying (read: spending a lot of money) to make cheap energy from a process that doesn't work. So there's that.