Category Archives: Technology

My Bitmoji

If I do anything fun with social media, I have my teenage daughter to thank. She’s the one who put the GIF app on my phone and helped me get an “avatar emoji” known as a bitmoji. Without her, I would be helplessly uninformed about these and other features of communicating in the 21st century.

My bitmoji is definitely a little silly (she says “let’s taco about it!” with a taco in her hand) and I tend to be serious. But I have to admit that I like being able to respond to people with a funny cartoon that looks a little like me doing funnier things than I usually do.

A great article about avatars by Amanda Hess in this weekend’s New York Times Magazine asks a question I have been thinking about since getting my bitmoji: “Online, we present ourselves in ever-more-numerous guises across a variety of platforms. What does the ‘avatar’ we choose say about who we really are?”

Hess recounts the history of avatars from Hinduism to today. “In Hindu theology, Vishnu assumes various earthbound avatars — among them a fish; a tortoise; a half-man, half-lion — in an effort to restore order at times when humanity has descended into chaos. Now we’re the gods,” she says, “reinventing ourselves online in the hope of bringing order to a realm we can’t quite keep under our control.” When I paste a bitmoji image of myself into a text message, I don’t consider myself a god, but I see Hess’s point about wanting to participate in, if not reinvent, the world I’m living in.

Hess points out that we represent ourselves differently via different platforms and uses herself as an example: “On Facebook, I’m posed by a professional photographer, waist contorted into a slimmed line, eyes peering up out the window of a skyscraper. On Snapchat, I’m burrowed into my office chair, blankly blinking my eyes open and closed.” She is not alone. Although my bitmoji is my only cartoon avatar, I use different photographs of myself on Twitter, LinkedIn, and Facebook to communicate different things about myself, as do many others.

But I wonder, haven’t people always acted, and presented themselves, differently in different settings? At least to some degree? And isn’t it arguable that successful people know how to modify their affect or appearance to suit a given context? Don’t we all, for the most part, dress up for the opera, and down for the grocery store?

As I usually do when pondering these kinds of questions, I turn to literature. In response to Hess’s question about what our choice of avatar says about ourselves, I think about Jay Gatsby, a lower-class kid from the Midwest who could have won an Oscar for his performance as a monied New York blueblood. Gatsby’s charade didn’t last forever, and it cost him his life, but in my opinion it wasn’t criminal that he tried to be something that he wasn’t.

I probably owe my attitude toward human chameleons to another literary figure, Oscar Wilde, who was the subject of my undergraduate thesis and with whom I spent a good year of college. It was Wilde more than anyone else who helped me to understand and accept that identity is not a static state of being, but a fluid reality. He understood that better than most, given that it was illegal at the time for him to be who he was.

In our use of digital avatars today, we continue a long legacy of trying on different masks and seeing which ones fit.

When It Comes to Words Per Minute, Less Is More

“The Power of Handwriting,” a recent Wall Street Journal article by Robert Lee Hotz, argues what many teachers already believe: that students who handwrite their notes learn better than those who type.

According to Hotz, faster note-taking does not correlate with deeper or even adequate understanding of the material. Researchers have found that “the very feature that makes laptop note-taking so appealing — the ability to take notes more quickly — was what undermined learning.”

Interestingly, digital note-taking does appear to result in short-term gains for note-takers. But after 24 hours, those who type notes start to forget the material they transcribed. Researchers at Princeton and UCLA compared the work of students who took longhand notes with that of students who typed, and found that the longhand note-takers retained knowledge longer and more readily understood new concepts.

Hotz reminds us that taking notes by hand has been a key learning strategy since ancient times and tells us that “writing things down excites the brain, brain imaging studies show.” Michael Friedman of Harvard adds that when we take notes, we actually “transform” what we hear, making information acquisition both dynamic and personal.

Any notes are better than no notes, researchers say. But teachers can attest to the greater level of focus they see in students who write down their thoughts as they listen and learn versus those who type transcript-style notes. The sharpest edge still belongs to the student who can distill and synthesize information as he or she hears it and commit it to memory through the practice of writing notes by hand.

There is a reason we are sometimes allowed to take a handwritten notecard into an exam: in deciding what information is essential to have in the exam room, we have likely undergone a rigorous and helpful process of separating the wheat from the chaff, committing to long-term memory the very concepts we are most likely to be tested on.

When I Grow Up…

I recently went to the annual National Association of Independent Schools conference, where over 5,000 administrators and teachers heard from speakers like Randi Zuckerberg, author of Dot Complicated (and sister of Mark), and Jaime Casap, Google’s “education evangelist.”

Unsurprisingly, they emphasized the amazing potential of technology to help us do everything we do, including school, better. But amid all of the excitement about 21st century disruption and transformation, Casap captured my attention with a surprisingly simple statement.

He said that because the nature of work, communication, and community is changing so rapidly, we really don’t know what kinds of jobs or lives we are preparing our students for right now.

He said that rather than asking kids what they want to be when they grow up, we should ask them what problems they want to solve.

Problem solving takes a lot of skill — for example, patience, creativity, an ability to conduct research and evaluate information, and an ability to communicate and collaborate. In many ways, these are the same competencies we needed twenty years ago to succeed in professions that we believed would exist. It’s just that today, as Casap suggests, we are fully aware that we are honing them for experiences that many of us can’t even imagine.

How do you prepare for the unknown? How do you train for a task that hasn’t yet been conceived? While these questions are challenging to answer, if schools work intentionally on behalf of and with students now, students will be ready for whatever comes their way.

Powering Up and Down

After a unanimous vote by the Academic Council a few weeks ago, our school decided to hold a “Power Down Day” with students. No devices in classrooms (we are a 1:1 school) and no cellphones during free time (we otherwise have an open policy). We wanted to see what would happen if we didn’t use technology in or out of class for one full school day.

We timed it to follow a visit from Patrick Cook-Deegan, who spoke to us about mindfulness, wellness, and education. As a college student, Cook-Deegan took a leave from Brown University to bike through Southeast Asia. He spent 10 days in a meditative retreat, meditating 10 hours a day. His experiences led him to dive into a life of global activism, local social justice work, and innovative educational design.

We didn’t think our students would have Cook-Deegan-style revelations after one day without technology. But we thought it would be interesting to test an assumption about technology in schools — namely, that it has contributed to stress by creating higher levels of both distraction and, perhaps counterintuitively, accountability.

One of the biggest concerns we had was that students wouldn’t know what to do in their free time without technology. In preparation for the day, we gathered old-school materials like art supplies, soccer balls, board games and cards for those times in the day when students would otherwise be turning to their phones or computers. Teachers made handouts rather than uploading materials to Haiku, and students used pencils for note-taking rather than pecking at their keyboards or screens.

There was a palpable difference in the atmosphere as students arrived on campus on the morning of Power Down Day. Specifically, they weren’t looking down at their phones while walking. In study hall, some groups of students who didn’t have homework, or didn’t want to get started on it, sat on the floor playing Clue, Monopoly, and Scrabble. Students were considerably louder and more engaged with each other in the outdoor spaces, and in some classes there appeared to be more group work taking place.

In most aspects of the day, though, it was business as usual. We polled students and faculty afterward to see what they thought about the experience, and most felt that while powering down showed us how much we depend on technology, it did not dramatically improve our lives to set it aside. For some, the negatives outweighed the benefits. Many people said that it was just a “lost” day — somewhat more fun, but definitely less productive.

Regardless, I hope we do it again. Mindfulness is more than simply taking time out of our usual habits and practices to think and reflect, but to be mindful, we need to do at least that much.

We are back to normal this week — cellphones and other devices in full use everywhere, all the time. I stacked the games up in my office and invited students to come play when they need a break from it all.

I’m not holding my breath, but I did have one taker so far, after school yesterday — my 7-year-old. It was nice to hear her laughter when, after she beat me soundly, the yellow and red Connect Four pieces fell to the floor.

What I Learned This Year in School

The idea for this blog came to me when I was on a plane from Cleveland to Charlotte after shadowing Ann Klotz at Laurel School. Shadowing Ann was a required part of my graduate work that turned out to be a life-changing experience.

I really dislike flying, so I was trying to think about the future rather than the present, in which I was buckled in a chair in the middle of the sky. I thought about Ann and what she represents to me and to many others she has taught or mentored. In addition to running a truly wonderful girls’ school, teaching English there, and raising her own children, Ann is a powerful writer who publishes her work in the Huffington Post and other places. Seeing her in action at school, and reflecting on her contributions to family, education, and journalism, I was inspired to create something, too. Thus, the blog: what I learned in school today.

My goal at first was to post twice a week. I figured that would be easy; after all, I usually teach every day, and that takes a lot of planning and thinking. But I was wrong — it was much harder than I expected to compose my thoughts and observations into succinct paragraphs, and I knew I needed to write in a way that was both short and sweet. I quickly shifted to posting once a week and, when things got very busy at school and in life, once every two weeks.

But despite the unexpected challenges, I vowed to stick with it because, in addition to wanting to create a platform for sharing ideas, I also hoped that blogging would help me to become a better teacher, mentor, and role model for my students. They know that I am interested in and a little worried about technology’s impact on things I cherish — identity, voice, relationships, reading, and habits of mind (like research, communication and the writing process). I wanted to show them that although I am wary, I can try new things. I wanted to join them where they are.

After one year and 54 blog posts, I can see that blogging has taught me a lot — greater discipline, for one thing. It has also pushed me to develop a clearer voice, to read more and talk more, and with more people. In truth, I learned a lot more than I anticipated I would learn. That’s how school, and life, are.

Some of the other things I learned this year in school:

  • That it’s important and okay to fail — as students, as parents, as people. I know I failed in some of my posts. But I also know that I gave each post as much of myself as I could. (See especially Jessica Lahey, The Gift of Failure.) Also, and phew, that vulnerability is a strength. (See especially Brené Brown, Rising Strong.)
  • That talking face-to-face is a human necessity. (See especially Sherry Turkle, Reclaiming Conversation.) And that with technology, as with everything else, adults need to model the behaviors we want our children to adopt. (See especially Catherine Steiner-Adair, The Big Disconnect.) I agree with Scott McLeod that if we don’t participate in the digital world with our children, we risk becoming “dangerously irrelevant.” (See his blog Dangerously Irrelevant.)
  • That we all still have a lot of work to do when it comes to discrimination, especially when it comes to race. (See especially Mahzarin Banaji, Blindspot: Hidden Biases of Good People.) And that across our nation — and in our classrooms — race always matters. (See especially Ta-Nehisi Coates, Between the World and Me, and Bryan Stevenson, Just Mercy.)
  • That Twitter is interesting and actually fun. Where else can you tag someone you don’t know and tell them, hey — I read your work, I liked it, and it made me think? (Thanks to Lindsey Mead, Nina Badzin, and the other fabulous writers at Great New Books.)
  • That a key to adapting to the many technology-driven changes in teaching and learning is to have more empathy — for teachers and students alike. Both are faced with an ever-expanding set of expectations in an increasingly competitive world. It often feels like the road to making things quicker, easier, sleeker, and smarter is filled with complicated messages and unwieldy tools. Everyone needs a bit more space and time to work it all out.

A final takeaway, but not one that’s in any way new, is that I am married to a wonderful husband. He reads my work amid his own deluge and has the affection for me to interrupt what he is immersed in to give me honest and helpful feedback. I’m indebted to him for his input — although I still don’t love hearing him tell me that what I’ve written “feels like two different pieces” or that it isn’t good enough (yet) to publish. Everyone needs an editor, but not everyone is fortunate enough to have one.

Ordinarily, I use a syllabus to frame a class so everyone knows where we’re going. But this year with my blog, the syllabus wrote itself as time went along — demonstrating once again that no matter where it happens, school is always in session.


Look at Us

[Image: Herman Melville quote]

Jennifer Egan is one of my favorite contemporary authors. I loved A Visit from the Goon Squad and The Keep, and I think her short story in the form of tweets, “Black Box,” is one of the more brilliant feats of creative writing.

After reading a recent article that captured a great conversation between Egan and George Saunders (“Choose Your Own Adventure”), I grabbed a copy of one of her earlier books, Look at Me, that I had never read before.

Published in 2001 but written over six years during the 1990s, Look at Me is a jarringly predictive novel — not only about the role of the internet and the way in which self-image is influenced by technology and social media, but also about terrorists, both foreign- and domestic-born, in America.

The novel, which was released before 9/11, is uncanny in the way it imagines and comments on the world as we have known it since 2001. I kept wondering as I read, how could she have possibly known the things she seemed to have known?

In her conversation with Saunders, Egan said that she had no inkling of what was yet to come. She thought Look at Me would be far-fetched, even funny. “I had never been online when I imagined a lot of that novel, and I was projecting forward into what I thought was extreme, goofy satire.” But, she said, she “took a long time to write Look at Me, and some of what I imagined as wacky hypotheticals — for example, a type of self-branding reality-TV-ish website I called Ordinary People — had already started to come true by the time I published it.”

Of the character Z, who comes to America to destroy it from the inside, Egan wrote in the afterword to the book, “Z had always worried me the most. I was afraid no one would find him credible … [W]hile it may be nearly impossible to read about Z outside the context of September 11, 2001, I concocted his history and his actions at a time when the events of that day were still unthinkable.” She concedes, “Had Look at Me been a work-in-progress in the fall of 2001, I would have had to reconceive the novel in light of what happened. Instead, it remains an imaginative artifact of a more innocent time.”

Whether or not we arrive at every terminus predicted in fiction is not the point. As long as we have the freedom to imagine the future, we may also have the power to shape it.

All Kinds of Minds

At the NAIS People of Color Conference in Tampa last week, I heard the amazing Dr. Mae Jemison talk to over 4,000 teachers and students about 100 Year Starship, a project dedicated to making “the capability of human travel beyond our solar system a reality within the next 100 years.”

Beyond the obvious challenges of distance and time, Jemison highlighted the additional challenge of getting enough different kinds of thinkers into the mix so that true innovation can occur. She cited data from Change the Equation showing that “the STEM workforce is no more diverse now than in 2001.” This is a problem, she said, because the promise of 100 Year Starship depends on diverse participants who dare to pursue coursework in fields still not typically seen as open to women or minorities.

While making the case for educators to more intentionally steer students from diverse backgrounds toward STEM, Jemison also made a great case for “scientific literacy” in general, which I particularly loved.

Scientific literacy, as defined by the National Science Education Standards, is “the knowledge and understanding of scientific concepts and processes required for personal decision making, participation in civic and cultural affairs, and economic productivity.” Jemison’s point of connecting science, diversity, decision making, and participation in civic and cultural life made clear, without a doubt, that all kinds of literacies are relevant to 21st century learners and citizens.

Literacy, the ability to read critically and write clearly, is foundational to, and transcends, every endeavor — be it the challenge of developing a self, contributing to a society, or voyaging beyond the known boundaries of space.

Sufjan, Salman & Me

I’m not the first and won’t be the last to say this: it’s different, and usually better, seeing an artist on stage than watching that same person on screen. This week, I was up close and personal with two virtuoso performers, Sufjan Stevens and Salman Rushdie, on back-to-back nights.

Stevens took the stage shrouded in fog and bathed in orange light. Wearing all black, he and his band were nondescript, standing in front of a screen with projected images from his childhood. Pictures and old film of handsome people, babies, the beach, and birthday parties flickered behind the musicians as they played. His most recent album, Carrie & Lowell, is an homage to his mother and stepfather, and his new songs provide intensely melancholy anecdotes and meanderings about childhood, family, grief, and loneliness. Moved by the haunting lyrics and unique sound, I sat motionless, watching and listening. When he sang that he “should have known better,” that “nothing can be changed,” and that he “should have wrote a letter,” I had to close my eyes.

The next night, Sir Salman Rushdie took to a very different, brightly-lit university stage wearing a blue blazer and khaki pants. He sat across from Mike Collins, host of a local NPR radio show, and cheerfully answered questions about his life as a writer and his view of the world today. Although he said that our world today is “demented,” he undercut his stated pessimism by referencing the brilliant, humane work of Shakespeare, Dickens, and even Jimmy Fallon. He said that “we are the storytelling animal” and that by telling stories, we create and define ourselves and the world. I loved his message as much as I have loved many of his books.

It was a special thrill to hear Rushdie’s accented voice and to see him — well, so alive. After years of hiding from assassins, he sat calmly with an amused expression on his face and spoke directly to a transfixed audience just as Stevens had sung his heart out to the crowd the night before. Being near these two artists reminded me of the power of moments of intersection with greatness, in person, without anything mediating or filtering that experience.

Get Proximate

Frank Bruni’s recent New York Times editorial, “An Ivy League Twist,” discussed a new initiative by more than 80 colleges, including all eight in the Ivy League, to support candidates from diverse backgrounds in the college application process. Students who may not have college advisors, as their more privileged peers usually do, will be able to navigate a special website that gives them information and tips on how to compile their applications and apply for financial aid.

As Bruni describes, optimists think this may be useful to the stated goal of bringing greater numbers of qualified minority or low-income candidates to elite colleges and universities. Cynics think it is a lame, perhaps even perverse, effort to create access and equity, one that will instead reinforce exclusivity in the face of the Common App.

Time will surely tell a more nuanced story. But this debate got me thinking about a phrase that plays often in my head, a phrase that the sui generis social justice advocate Bryan Stevenson shares in his powerful book, Just Mercy. His grandmother, who loved him so much and hugged him so hard that he “could barely breathe,” repeatedly told him: “You can’t understand most of the important things from a distance, Bryan. You have to get close.”

Stevenson’s grandmother was able to see that proximity to people is what best helps us transcend seemingly intractable issues. When I think about what it would really take for elite colleges and universities to reflect the diversity of American society, I wonder how anything other than a face-to-face conversation, or better yet an embrace, can get us there.

About Face

Sherry Turkle’s book, Alone Together, was important to me and my work on digital citizenship at my school. She made a strong argument for the power of face-to-face communication and the potential hazards for people steeped in digital media — chief among them the loss of empathy.

I’m now waiting anxiously for Turkle’s new book, Reclaiming Conversation: The Power of Talk in the Digital Age. In particular, I’m curious whether Turkle’s essential thesis remains the same, especially in light of what I’ve read this week. Alison Gopnik’s “No, Your Children Aren’t Becoming Digital Zombies,” Sumathi Reddy’s “Pediatricians Rethink Screen Time Policy for Children,” and Teddy Wayne’s “Found on Facebook: Empathy” all contribute to the growing body of evidence that technology is less dangerous and possibly less disruptive than previously thought.

Contrary to many parents’ fears, says Gopnik, “teenagers’ experience in the mobile world largely parallels rather than supplants their experience in the physical world.” In other words, digital relationships run alongside or depend on real relationships; they make teenagers’ social lives no more or less complex than those lives already are.

Pediatricians, according to Reddy, are in the process of reevaluating screen-time recommendations for children. While the longstanding advice has been that children under the age of 2 should not have access to any screens at all, researchers and groups including Common Sense Media are increasingly wary of this advisory. Reddy cites James Steyer: “Some of the traditional recommendations, like discouraging all screen time before age 2, just don’t fit with reality circa 2015-2016.”

And contrary to the view that social media necessarily decreases empathy, Wayne argues that it actually has the potential to broaden our knowledge about, and perspective on, the lives of others. Further, it is possible that the younger the user, the greater the empathy: “The youngest generation may be the most amenable to screen-based opportunities for empathy,” he writes.

No matter where Turkle comes out on some of these issues in her new book, it’s clear that we have plenty to learn and observe when it comes to the impact of digital media on how we communicate, learn, and live. Of course, this makes developing and honing best practices even more challenging. But it is certainly fascinating to see it unfold, and to participate in shaping the future.