On “Routine” Computing at Scale as a Disabling Technology
Ever since I read Mark Gahegan’s discussion and definition of Geocomputation in preparation for the AAG 2019 Urban Data Science Panel, something he wrote has stuck in my mind. Namely, his discussion of disabling technologies is both cogent and intellectually flexible:
Disabling Technology (Gahegan, 1999) The late 1970s and early 1980s saw the rise of databases: large monolithic systems that standardised on interfaces, file structures and query languages.
On the 11th, I’ll be speaking at the Q-estival at the University of Exeter’s Q-Step Centre. Q-Step is a program to promote a “step change” in quantitative social science skills in the UK, and I’m the Bristol Geography lead for it. I’m really excited to take my first trip out to Exeter, see the place, and talk about some interesting computational social science!
In my talk, I’ll present some work I’ve done with Eli Knaap & Serge Rey on urban social boundaries.
I recently finished reading Cal Newport’s Deep Work: Rules for Focused Success in a Distracted World. While it’s pretty reasonable advice overall (like the nugget “When you work, work hard. When you’re done, be done.” p. 154), there are a few parts that I found really powerful and affecting. I bet most folks take just one message from Newport’s very well-publicized media appearances: his digital minimalism message. While that’s a pretty important concept, it actually isn’t the main message of Deep Work.
Towards a replicable future for geographic data science Levi John Wolf, Sergio J. Rey, & Taylor M. Oshan Over the next few days, I’ll be attending the Conceptualizing a Geospatial Software Institute Workshop at SESYNC. The original submission is here. In it, I detail how important it is to ensure that an open source community is built alongside open source software. Given the role open source plays in Open Science and the Open Government initiative at the NSF, my collaborators and I felt it was critical to represent this perspective at the workshop.
The University of Bristol Email Charter I really like the University of Bristol’s new email charter. But I think it could be made a little clearer and more actionable, especially since it can build on previous work and thinking about making emails clearer, shorter, and fewer in number. Below, I’ll go through the points listed in the original charter and try to make them clearer. I’ll also try to add actions that give the points some teeth.
I’m really excited to announce that my longstanding package for working with the US Census Bureau API, cenpy, has gotten some long-needed love and attention. The new method of working with the data is really slick (if I do say so myself).
Check it out in this gist! If you like it, hate it, or want to improve it, hit me up over on the project or on my twitter.
One thing I find so difficult to accept is crunch time. Not that I can’t cope, or even succeed, in crunch, per se. But, rather, I’m beginning to find it very peculiar that folks respond to difficult times through faith in their ephemerality… as if
this specific difficult phase is going to pass
is somehow heartening.
When my job is at its worst, I hear this a lot.
When comparing a multilevel model to a fixed-level model, it’s important to consider how each is parameterized. For instance, let’s say you’re comparing a no-pooling model to a partial-pooling variance components model. In this case, we have:
$$ y= \Delta u + \epsilon$$
as the specification, where $\Delta$ is the dummy variable matrix, $y$ is the outcome of interest, and $\epsilon$ is the usual homoskedastic error term for the responses.
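As a sketch of the contrast, the following simulation fits the no-pooling model (one dummy per group, which reduces to the group means) and then forms partial-pooling estimates by shrinking those means toward the grand mean. The group sizes, variances, and the closed-form shrinkage factor are illustrative assumptions, not from the original post (and the true $\tau^2$, $\sigma^2$ are used directly rather than estimated, for clarity):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate y = Delta u + epsilon: 5 groups, 20 observations each,
# group effects u ~ N(0, tau^2), noise epsilon ~ N(0, sigma^2)
n_groups, n_per = 5, 20
tau, sigma = 2.0, 1.0
u = rng.normal(0.0, tau, n_groups)
groups = np.repeat(np.arange(n_groups), n_per)
y = u[groups] + rng.normal(0.0, sigma, n_groups * n_per)

# No pooling: Delta is the dummy variable matrix, so the least-squares
# fit of y on Delta is exactly the vector of group means
Delta = (groups[:, None] == np.arange(n_groups)).astype(float)
u_nopool, *_ = np.linalg.lstsq(Delta, y, rcond=None)

# Partial pooling: shrink each group mean toward the grand mean by a
# factor governed by the variance components (tau, sigma known here)
grand = y.mean()
shrink = tau**2 / (tau**2 + sigma**2 / n_per)
u_partial = grand + shrink * (u_nopool - grand)

# Each partial-pooling estimate lies between its group mean and the grand mean
assert np.all(np.abs(u_partial - grand) <= np.abs(u_nopool - grand))
```

The point about parameterization shows up here directly: the no-pooling coefficients are free parameters per group, while the partial-pooling estimates are tied together through the shared variance components.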
The Geometer’s Angle What John O’Loughlin talks about in his recent presidential address in Political Geography strikes me as substantially similar to many of the things I’ve read from him on the field. Indeed, it reminds me of the same track I got from him as a PhD applicant seeking to work in quantitative political geography. I’ll never forget: right at the time of what I thought (and still feel) was great, ground-breaking work in political science focusing directly on the new understandings possible from aggregate electoral data [1,2,3], he suggested that electoral analysis needed to go beyond this.
To help subpackages that depend on libpysal, the API will change shortly to be in line with our desired migrating.pysal.org API.
If you test your package against the PyPI version of libpysal (which you get using pip install libpysal), you already know that your changes don’t break with respect to what exists. However, if you’d like to give yourself some lead time to detect breaking changes in the development version of libpysal on GitHub, feel free to follow these directions on how to set up optional tests on Travis.
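The shape of such an optional job in .travis.yml looks roughly like the fragment below. This is a sketch, not the linked directions themselves, and the environment variable name is made up for illustration:

```yaml
# Sketch: an allowed-to-fail Travis job that runs your test suite
# against the development version of libpysal from GitHub.
# LIBPYSAL_SOURCE is an illustrative name, not part of any real setup.
matrix:
  include:
    - env: LIBPYSAL_SOURCE=git+https://github.com/pysal/libpysal.git
  allow_failures:
    - env: LIBPYSAL_SOURCE=git+https://github.com/pysal/libpysal.git
install:
  - pip install $LIBPYSAL_SOURCE
  - pip install -e .
script:
  - pytest
```

Because the development-version job is listed under allow_failures, a breakage there warns you early without turning your whole build red.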