In our latest Office Hours series, we talked with computer scientist and executive director of online education at the College of Computing at Georgia Tech, David Joyner. Watch the following videos to see Joyner, who teaches several courses on edX, answer your questions about computer science and cybersecurity trends.
Office Hours is a video Q&A series, hosted on the edX Instagram page, where guest experts answer questions submitted by our learners.
Do you think cybersecurity will always be a must to learn in computer science?
The thing about cybersecurity is that to really build secure software, security has to be built into the software at every level. It's common for us to think of standalone tools for cybersecurity, like firewalls, as what we're really talking about when we talk about cybersecurity. But when you really look at how software is often hacked or exploited, it's often in layers that we don't really think of as the cybersecurity layer. It's things like input validation, or how you store data in a database, or how test credentials might be left inside the software after it's complete. Those things happen at every level and they always will. So some understanding of the cybersecurity implications of the code that you write will always be critical for every job in computer science.
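Joyner's point about input validation being a security layer can be sketched in code. The snippet below is an illustrative example, not something from the interview (the table and function names are invented): it rejects malformed input up front and uses a parameterized query so user-supplied text is treated as data, never as SQL.

```python
import sqlite3

def find_user(conn, username):
    """Look up a user by name, treating the input as data, not code."""
    # Validate before the input ever reaches the database layer.
    if not username.isalnum() or len(username) > 32:
        raise ValueError("invalid username")
    # A parameterized query keeps attacker-controlled text out of the SQL itself.
    cur = conn.execute("SELECT id FROM users WHERE name = ?", (username,))
    return cur.fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

print(find_user(conn, "alice"))
try:
    find_user(conn, "alice'; DROP TABLE users; --")
except ValueError as err:
    print("rejected:", err)
```

Neither check lives in a standalone "security tool," which is exactly the point: the protection is built into an ordinary layer of the application.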
What’s a position that will still be around in 10 years, and what’s in high demand now?

The areas we hear most about nowadays are analytics, cybersecurity, web development—and I don't see any sign that those are going to change or diminish anytime soon. If anything, they're going to be more and more important, but they're going to evolve. We've already seen that with web development. Developing for mobile devices has fundamentally changed what web development really is all about. We'll see that more and more with the emergence of wearable devices, augmented reality, VR goggles, everything like that. Those will all benefit from being able to tap into already popular sites and networks. It's sort of a chicken and egg thing.
For people to buy smartwatches and VR goggles, there has to be some benefit that they get from those devices, but for developers to develop for those devices, there have to be consumers who own those devices. So which came first, the people buying the device or the developers supporting the device? I think the way we bridge that gap is by finding ways to take existing value and leverage it in new technologies. So what I think is going to be interesting is that the jobs of today, like web development and cybersecurity, are going to be important to the jobs of tomorrow, like virtual reality and smart fabrics and whatever else comes next.
Is programming a basic skill everyone should learn? Which programming skills are crucial now?
I think everyone should learn to code for a few reasons. Everyone should get the chance to find out if computer science is something they enjoy doing. A lot of my students have said they never expected to like the topic, but once they experienced it, they loved it and they added it as a minor or changed their major. And so I think everyone should at least give it that chance. It's also a skill that's surprisingly applicable to everyday life. I've written little Python scripts that help me with my household budget or help me keep track of students' progress in my courses. They don't use any concepts that go beyond my CS1 class, but they're really powerful in terms of how much of my time they save. So I really think everyone should learn to code because you might not realize how much you're going to like it. And even if you don't love it, you might find out it's really useful. It can really make your everyday life easier.
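As an illustration of the kind of small script Joyner describes, here is a household-budget helper in a few lines of Python. The categories and amounts are made up for the example, and nothing here goes beyond introductory CS concepts:

```python
from collections import defaultdict

# Hypothetical monthly expenses as (category, amount) pairs.
expenses = [
    ("groceries", 82.50),
    ("transit", 30.00),
    ("groceries", 47.25),
    ("utilities", 110.00),
]

def summarize(entries):
    """Total spending per category, largest first."""
    totals = defaultdict(float)
    for category, amount in entries:
        totals[category] += amount
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for category, total in summarize(expenses):
    print(f"{category:>10}: ${total:.2f}")
```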
Which type of computer science course is suitable for beginners?
The usual jumping in point for computer science is a course like my CS1 course. It's a programming course. It doesn't assume you've ever done any coding before, and it gets you started with the basics of writing computer code. But that's not the only place you need to jump in from. Computer science is about a lot more than programming. I also teach a human-computer interaction course, and you don't actually need to know anything about computer coding in order to take and understand that course. I've taken MOOCs on topics like blockchain and nanotechnology. And while I've benefited from having a computer science background, I didn't need one to get something out of those courses.
I think that's, ultimately, one of the things I like the most about the openness of education nowadays, and the MOOC movement in general. You don't have much to lose by jumping into a course you might not be ready for. You're not out thousands of dollars in tuition by signing up for a course that you don't have the prerequisites for. If anything, you find out, "I'm not quite ready for this; I'll go prepare and come back when I'm ready." So, why not jump right into whatever you're interested in and find out if you're ready for it?
Is there a shortage of AI skills today? If so, which skills and in which sectors?
When it comes to AI skills, we very often talk about how there are more jobs for people with AI skills than there are people to fill them, so there's a shortage. But I think that's really only half the equation, because when we're talking about that, we're usually talking about AI hard skills, which I think of as, "I know how to use PyTorch. I know how to use OpenCV," which are popular AI libraries. What we really need more of, though, are people with what I would describe as AI soft skills: the ability to look at a problem and understand, from the perspective of AI development, what's easy, what's hard, and what AI can contribute to solving that problem.
There's a great xkcd comic where a developer is asked by a client to develop some software that can identify whether a photo was taken in a national park and whether it's of a bird. And the developer very quickly says the first part is trivial and the second part is almost impossible. That, I think, is what we need more of: not necessarily people who can design AI agents given a very clear specification, but people who can look at a problem and understand whether it's even a problem that AI is ready to solve.
How did you learn to write code?
I first learned to write code when I took AP computer science in high school. I first learned to love to write code when I realized I could write a program that could basically do my chemistry homework for me. We'd get these long homework assignments that had a lot of problems to work out, and they all applied known formulas. But they were just time-consuming to work out by hand. I realized I could write a program where I would just put in the initial numbers and it would do all the calculations and spit the answer back out at me.
And I wrote the code, so I'm still the one doing the work, right? Since then, honestly, most of the things I've coded have been kind of similar to that. Probably 95% of the code I've ever written has never been used by anybody except for me. But I use it all the time to automate repeated tasks, or speed up some analysis I was doing anyway. I've written code to help us with our household budget, to manage my fantasy football team, to do the formatting for video game guides I used to write. So I use it for everything. I think it becomes far easier to learn when you're working on something that you find personally interesting and when you know it's going to benefit you.
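A sketch of the kind of homework helper Joyner describes. He doesn't say which formulas his program applied, so the ideal gas law stands in here as a hypothetical example: you put in the initial numbers, and the program does the calculation.

```python
def ideal_gas_pressure(n, T, V, R=0.08206):
    """Solve PV = nRT for pressure in atm, given moles, kelvin, and liters."""
    return n * R * T / V

# Plug in the initial numbers; get the answer back out.
# One mole of gas at standard temperature in 22.4 L is about 1 atm.
print(ideal_gas_pressure(n=1.0, T=273.15, V=22.4))
```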
What’s the future of privacy in this technical and digital world?
I think the challenge with privacy in this new world is that in order to do whatever it is that we're doing with technology, we almost inherently have to digitize and store some data about you for some period of time just to accomplish whatever it is you're trying to accomplish. And what that means is that there is this mountain of passively gathered data that's growing all the time.
We think of violations of privacy like someone with binoculars peering through our window, but it's really more like someone going through our trash. Most of what they're going to find is going to be garbage and unusable. But occasionally, they'll find a credit card offer or a bank statement that they can use against us. And I think the same kind of thing happens with data privacy. We can use that analogy to tell us what we should do about it. Just like we take out the trash often, data should be disposed of often. Just like we cut up credit card offers before we throw them in the trash, data should be stored in such a way that if someone does get access to it, it's not that usable.
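One way to make stored data "not that usable" if someone gets access to it, to borrow the cut-up-credit-card analogy, is to keep a salted hash instead of the raw value. The example below is a minimal sketch using Python's standard library, not a technique described in the interview: the password itself is never stored, so a stolen copy of the database reveals only salts and digests.

```python
import hashlib
import hmac
import os

def store_password(password):
    """Keep only a random salt and a salted hash; never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password, salt, digest):
    """Re-derive the hash from a login attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = store_password("correct horse")
print(check_password("correct horse", salt, digest))  # the right password verifies
print(check_password("wrong guess", salt, digest))    # anything else does not
```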
It's going to be difficult, because data gathering and storage is so fundamental to the operation of many modern technologies. But I think it's doable through a combination of responsible design by product makers and by sensible precautions that we can take as users.
Enjoy our Office Hours series? To hear from more experts and submit your own questions, follow edX on Instagram.