How to give and receive a good design critique
Article by Kate Lawless and Shannon Crabill
Filed Under: Inspiration, Professional Development, Article, Advice, Collaboration
This story was originally published by AIGA Baltimore.
Why is critique so important?
As designers, we don’t design in a vacuum. A good designer needs to learn to take feedback from peers, clients, and bosses to solve a particular design problem. Critiques also help you broaden your communication skills as a designer: there is always the opportunity to articulate why you did what you did, or to better explain your idea to a reviewer who doesn’t see it as clearly as you do.
A good critique can involve both positive and negative feedback, which can be tricky to navigate. Here are some quick tips on how to give, and receive, good design feedback during a critique.
How to give a good critique: The love sandwich
The best way to approach critiquing someone else’s work is to sandwich the feedback with love. If you think of your critique as the sandwich, the bread would be what you “love” about the work and the middle—the fillings—would be what you didn’t like as much.
First, tell your fellow designer what aspects you like about the piece, whatever they may be. Be descriptive. Instead of just saying “I like it,” explain why you like it, using specific examples from the design whenever possible.
Next, move onto the constructive criticism. If you think certain aspects of a design aren’t working, try to explain why or offer suggestions on how they can be improved. Asking the designer questions may help them to see problems in the execution of the design that they may not have seen on their own.
You may also want to limit your use of personal pronouns, like “you,” to make sure your critique is about the design work and not about the designer. We all feel personally about our work, but during a critique it’s best to separate the person from the piece. For example, say you have a critique about a line intersection. You might say, “The way this line intersects with that line,” instead of “The way you intersected this line with that line.” This helps reassure the designer that the criticism is about the work and not about them as designers.
You don’t have to agree with or like the designer’s decisions, but their work deserves honest feedback. Put yourself in their shoes: if they are brave enough to share their work and ask for feedback, then they deserve to get it, both the good and the bad.
Finally, don’t forget to repeat or elaborate on what you liked about the piece so that the critique ends on a positive note. This way, the designer knows the piece may need some reworking, but also that there are aspects of the design that work as-is, too.
How to receive critique well: A grain of salt
Hopefully, your fellow designer will follow the Love Sandwich guidelines and give you a great, honest critique. It’s important to take both the good and the bad feedback in stride. Design isn’t math: there are no right and wrong answers, only subjective opinions that may differ from one designer to another.
That being said, remember that a critique is about your work and making it the best it can be; it shouldn’t be about you. If you disagree with specific feedback, explain your decisions thoughtfully but also listen to what’s being said. Remember, those who are giving critiques generally do so because they want to help you grow as a designer, so try not to get defensive or take their criticisms personally.
And if you don’t agree with specific comments you receive during a critique, it’s okay to ask for other opinions, too. Baltimore is filled with great designers who are willing to help and who love to give a good critique. There are also online resources like Dribbble or Behance, where you can share your work with others around the globe. Anyone, even a non-designer friend or coworker whom you trust to give honest and constructive feedback, can be a good resource. And a good round of feedback is always better than no feedback at all.
Critique of a Research Article
The goal of this activity is to give you an opportunity to apply what you learned in this course to evaluating a research paper. Warning! You may have written article summaries or even critical evaluations of other resources before; this activity is unique, however, because you will evaluate a research article from a methodology perspective.
For this assignment, you will briefly summarize and extensively evaluate the attached educational research article (if you cannot download the article, go to BeachBoard/Content/Articles to download it).
This assignment should be done individually. In the summary section, write a brief (up to 500 words) summary of the article in your own words. Do not copy and paste; rephrase instead. This will be good practice for your final project’s literature review. In the critique section, evaluate the article using the following grading criteria.
Grading criteria for research critique
In your summary, you should identify the main elements of the research, including:
1. Research problem
2. Research goal
3. Research questions
4. Research method (briefly explain)
5. Sample (participants)
6. Tools (instruments, tests, surveys)
7. Main findings (brief summary of the results)
The critique part should be 2-3 pages (1000-2000 words) and include the following sections. Your critique should be longer than your summary, and you should pay special attention to the design and procedure. Your grade on this assignment is based on your answers to the following questions.
There is a long list of questions. You don’t have to address all of them, but you should address the highlighted questions. Some questions are relevant to this article and some are not; I listed so many simply because I’d like you to learn what to look for when evaluating a research article.
The format of your paper should NOT be like a Q & A list. Instead, you should integrate your answers into an essay format similar to the given examples.
Problem
1. Is there a statement of the problem?
2. Is the problem “researchable”? That is, can it be investigated through the collection and analysis of data?
3. Is background information on the problem presented?
4. Is the educational significance of the problem discussed?
5. Does the problem statement indicate the variables of interest and the specific relationship between those variables which are investigated? When necessary, are variables directly or operationally defined?
Review of Related Literature
1. Is the review comprehensive?
2. Are all cited references relevant to the problem under investigation?
3. Are most of the sources primary, i.e., are there only a few or no secondary sources?
4. Have the references been critically analyzed and the results of various studies compared and contrasted, i.e., is the review more than a series of abstracts or annotations?
5. Does the review conclude with a brief summary of the literature and its implications for the problem investigated?
6. Do the implications discussed form an empirical or theoretical rationale for the hypotheses which follow?
Hypotheses
1. Are specific questions to be answered listed or specific hypotheses to be tested stated?
2. Does each hypothesis state an expected relationship or difference?
3. If necessary, are variables directly or operationally defined?
4. Is each hypothesis testable?
Sample
1. Are the size and major characteristics of the population studied described?
2. If a sample was selected, is the method of selecting the sample clearly described?
3. Is the method of sample selection described one that is likely to result in a representative, unbiased sample?
4. Did the researcher avoid the use of volunteers?
5. Are the size and major characteristics of the sample described?
6. Does the sample size meet the suggested guideline for minimum sample size appropriate for the method of research represented?
Instruments
1. Is the rationale given for the selection of the instruments (or measurements) used?
2. Is each instrument described in terms of purpose and content?
3. Are the instruments appropriate for measuring the intended variables?
4. Is evidence presented that indicates that each instrument is appropriate for the sample under study?
5. Is instrument validity discussed and coefficients given if appropriate?
6. Is reliability discussed in terms of type and size of reliability coefficients?
7. If appropriate, are subtest reliabilities given?
8. If an instrument was developed specifically for the study, are the procedures involved in its development and validation described?
9. If an instrument was developed specifically for the study, are administration, scoring or tabulating, and interpretation procedures fully described?
Design and Procedure
1. Is the design appropriate for answering the questions or testing the hypotheses of the study?
2. Are the procedures described in sufficient detail to permit them to be replicated by another researcher?
3. If a pilot study was conducted, are its execution and results described as well as its impact on the subsequent study?
4. Are the control procedures described?
5. Did the researcher discuss or account for any potentially confounding variables that he or she was unable to control for?
Results
1. Are appropriate descriptive or inferential statistics presented?
2. Was the probability level, α, at which the results of the tests of significance were evaluated, specified in advance of the data analyses?
3. If parametric tests were used, is there evidence that the researcher avoided violating the required assumptions for parametric tests?
4. Are the tests of significance described appropriate, given the hypotheses and design of the study?
5. Was every hypothesis tested?
6. Are the tests of significance interpreted using the appropriate degrees of freedom?
7. Are the results clearly presented?
8. Are the tables and figures (if any) well organized and easy to understand?
9. Are the data in each table and figure described in the text?
Discussion (Conclusions and Recommendation)
1. Is each result discussed in terms of the original hypothesis to which it relates?
2. Is each result discussed in terms of its agreement or disagreement with previous results obtained by other researchers in other studies?
3. Are generalizations consistent with the results?
4. Are the possible effects of uncontrolled variables on the results discussed?
5. Are theoretical and practical implications of the findings discussed?
6. Are recommendations for future action made?
7. Are the suggestions for future action based on practical significance or on statistical significance only, i.e., has the author avoided confusing practical and statistical significance?
8. Are recommendations for future research made?
Make sure that you cover the following questions, even if you have already touched on them elsewhere in your critique.
1. Is the research important? Why?
2. In your own words what methods and procedures were used? Evaluate the methods and procedures.
3. Evaluate the sampling method and the sample used in this study.
4. Describe the reliability and validity of all the instruments used.
5. What type of research is this? Explain.
6. How was the data analyzed?
7. What is (are) the major finding(s)? Are these findings important?
8. What are your suggestions to improve this research?
Here is a hint on how to evaluate an article.
Use this resource for writing and APA style.
Examples (please note some examples are longer than what is expected for this article)
· Good example
· Poor example
· Original article
· Article critique