January 6, 2025
From the editor
Hello, all! There’s something we’ve been mulling over here at the AJS, and I thought this might be a good time to solicit your input. The great strength of the AJS is of course its reviewing community. Every year, we get reviews from around 1000 distinct reviewers. And that takes a great deal of work. It’s very easy to find the first 100, but the second 100 is tougher, and by the time we’re getting to the tenth hundred…it’s a real problem. Even so, I think it is important that we increase our reviewer pool. This would, among other things, speed up our turnaround for submitted papers, which has long been our chief goal.
In particular, it is my belief that there are early-career scholars all across the globe who would be eager to review for the AJS. (I admit I don’t have any hard evidence for this belief, but there it is.) That’s because sociologists all over are oriented to the US system. I think that, overall, this isn’t great – it would be better if there were more independent systems that could then be in dialogue with one another, as opposed to hiring and promotion in many countries being driven by candidates’ acceptability to the US system. Still, that’s the way it is, at least now. We’ll see if machine translation changes things (it could allow the flourishing of multiple, interacting but somewhat autonomous, fields, but it also could increase the stampede to the US-centered one).
So I think there are early-career scholars who have good reason to be oriented to mainstream US sociology, and I think we would profit from their input. True, it can take the board more work to contextualize reviews from inexperienced and/or early-career scholars: they usually lack an intuitive feel for the distribution of possible papers, and they may be more oriented to the negatives than the positives in a manuscript. But they also often give extremely detailed reviews. And they would help lighten the load on US reviewers because, as it stands, there is a strange imbalance: at the AJS we find we are asking US-based sociologists to review work by scholars outside the US more often than the reverse.
So why not do it? The answer is, early career scholars are in general hard to find. Not so hard if they are graduate students at central departments, especially if they have co-authored with more advanced faculty. But when they are outside these departments, and haven’t yet published in US sociology journals, we don’t know how to find them. Many journals simply let people volunteer themselves as reviewers, and we can do this…but we still won’t find them in our database. That’s because, as a general journal publishing articles that usually speak to more than one subfield, the sorts of “tags” (e.g., knowledge, gender, stratification) that might be used to identify reviewing competence in a more highly structured discipline just aren’t very informative.
Some journals appoint a special resident editor (an “International Editorial Board”) for each country from which they expect contributions. The advantages are obvious: this editor will understand the relevant university system and the kinds of work being done there. We are impressed with journals that have taken on the challenge of a global reviewing pool. To us, however, the potential disadvantage of giving one person total gatekeeping control over a country’s access to the mainstream outweighs those advantages. So we are looking for another way.
We haven’t found it, and so we turn to you for your thoughts and suggestions. We do have one idea: when early-career scholars volunteer to review for us, we ask them, “What published paper is just the sort of thing you’d, if all goes well, do?” If that scholar replies, “I’d like to write the next Swidler (1986),” we then associate their name with the name of the first author (Swidler) of that paper. So when we are looking for someone to review a paper, and we think Swidler would be a good reviewer, we search and get not only Swidler but also this scholar.
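As a rough illustration only (this is not the AJS’s actual system; the names `aspirations`, `register_volunteer`, and `candidate_reviewers`, and the choice of Python, are our own assumptions), such a lookup could be sketched as:

```python
from collections import defaultdict

# Map each "aspirational" first author to the early-career volunteers
# who named one of that author's papers as the sort of thing they hope to write.
aspirations = defaultdict(list)

def register_volunteer(volunteer, aspirational_first_author):
    """Record that a volunteer hopes to write 'the next' paper by this author."""
    aspirations[aspirational_first_author].append(volunteer)

def candidate_reviewers(first_author):
    """A search for an established reviewer also surfaces the linked volunteers."""
    return [first_author] + aspirations[first_author]

# Hypothetical example, following the letter's Swidler (1986) illustration:
register_volunteer("Early-Career Scholar", "Swidler")
print(candidate_reviewers("Swidler"))  # ['Swidler', 'Early-Career Scholar']
```

The point of the sketch is only that the association is a one-to-many index keyed on the first author’s name, so it piggybacks on searches the editors already perform.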
That’s pretty crude, but it seems neither very hard to implement nor obviously biased. Still, we’re looking for other ideas. If you have any, or want to comment on this general project, we’d love to hear from you. So here’s to another new year!
John Levi Martin
Editor
____________________________________
Previous Letters:
August 2024
From the Editor
Hello, all! We’re halfway through 2024, and this is a good time to take stock of developments in the field. We’re very pleased to see continuing pushes for increased rigor and methodological sophistication, and we want to discuss three in particular. So this letter will be longer than my previous ones.
First, a committee spearheaded by the National Academy of Sciences has released a set of strong recommendations for standards relevant to the construction and use of survey research. The document can be found here (https://www.pnas.org/doi/10.1073/pnas.2406826121).
They summarize these standards/principles as follows:
- Improve disclosure of sampling design, modeling, and weighting assumptions for all surveys—probability and nonprobability alike.
- Disclose question wording and order.
- Improve disclosure of respondent recruitment and question-related panel conditioning factors.
- Disclose known and expected consequences of attrition on panel surveys.
- When survey data are weighted, the phrase “representative sample” should not be used without explicit acknowledgment of the underlying assumptions, including disclosure of weighting and modeling used. Survey vendors should not release data without this information, and publishers of content that use survey data should publicly commit to cite or use data only from sources that provide this information.
These are eminently good ideas, and we endorse them heartily; we encourage authors to use them when writing their manuscripts, and reviewers to take them into account.
Second, we’ve been mulling over the increased importance of preregistration in sociology. First, as Brendan Nyhan has pointed out (https://bsky.app/profile/brendannyhan.bsky.social/post/3kecol3gx5x2c), reviewers often imagine that preregistration means that one has called a coin toss in the air, and they do not necessarily examine preregistration documents to determine to what extent the submitted paper in fact follows them. This can make the mere fact of preregistration less meaningful than it might seem. One doesn’t have to call a hypothesis in advance for preregistration to be useful; explaining how coding and data cleaning are to be done can be even more significant in reassuring readers that results are not cherry-picked. But we recognize that different subfields have different ideas about preregistration: for some, it involves clearly specifying predicted results; for others, it does not. Hence there is no substitute for reviewers actually examining the preregistration documents. We also note that it is now widespread practice for authors to make anonymized versions of these documents available for reviewers to examine. Reviewers rightly react negatively to papers that claim preregistration but do not make these materials available.
There is much to appreciate in the move toward preregistration. At the same time, it may have done less than originally hoped to increase the rigor of many areas in sociology. Further, it may in some cases give an incentive for researchers to propose extremely “safe” research designs, which is not good for the field overall. And this brings us to the final issue we want to bring up here:
Third, we’ve been mulling over the idea of allowing preacceptance of papers based on research design. (These are often called “registered reports.”) The idea is that whereas preregistration may give people an incentive to conduct simple and safe research, registered reports do the opposite. Someone who proposes to undertake a difficult, innovative, and perhaps expensive project of gathering new data to answer (if possible) an important question is given courage, and perhaps the ability to secure necessary funding, by having the resulting paper accepted at a journal whatever the results are, so long as the proposed plan is followed.
There are plenty of complexities here, and we actually doubt that this would be widely used in sociology. But we can imagine it being a way that journals can help move the field forward. We’re interested to hear what you have to say about this. Please feel free to pass on any thoughts, encouraging or discouraging, to the board.
Finally, we hope we don’t need to emphasize that none of this means that the AJS is looking to impose a one-size-fits-all model on social research, or that we are leaning away from ethnographic, interviewing, or historical work! And apropos of that, we’d like to announce that the 2024 Gould Prize is going to Anna Skarpelis, for her extremely innovative historical study of visual representations of race in Nazi Germany, “Horror Vacui: Racial Misalignment, Symbolic Repair and Imperial Legitimation in German National Socialist Portrait Photography.” Rigor need not come at the expense of creativity.
Excelsior!
John Levi Martin
Editor
____________________________________
January 2024
Dear Friends:
Happy New Year! It’s been quite an interesting 2023 here at the AJS. Today, I’d like to draw your attention to a special feature in the November 2023 issue (see the Trending page), and also to some minor changes in our instructions for authors and for reviewers that have come from some of the very productive debates about the use of preregistration in the social sciences. Our goal is to continue to respect the wide variety of forms of rigorous research in sociology while also learning how the reviewing system can incorporate emerging best practices.
Here’s to a great, or at least, nonheinous, 2024!
John Levi Martin
Editor
____________________________________
January 2023
I am delighted to announce here the new website, along with the mirrored documents at the University of Chicago Press AJS web page. We will be using this space to help prospective authors, reviewers, and readers understand the AJS system, which we think is a very cool one, and one that can play a very special role in the discipline. Most of what you’ll see here is us laying out clearly how we have long worked.
We also have more detailed information on how to prepare your manuscript and how to think about the review process. One change that will happen soon is that we will be asking authors about sharing code and/or data on our Dataverse account. If this isn’t relevant for your manuscript, don’t panic! It’s there for those who are using formal data analysis—not for, say, ethnographers. But since reviewers are increasingly interested in whether data are public, we’ll be asking you at time of submission, and passing that information (not the data) on to them. You can find out more about this in our Information for Authors area.
This space (“From the Editor”) will be used for informal communication, as well as making any other clarifications needed. Here’s to a great 2023!
John Levi Martin
Editor