Unlike Brad, I don't believe in free will, but I agree with Brad's answer. I just don't think free will is the crux of the matter.
Computers are not sentient and are incapable of feeling pain. To me, the question of whether or not computers are slaves is exactly the same as whether or not zippers or pencils or ovens are slaves. If you want, you can call them that, but as Richard mentions, that dilutes the word "slave." It makes the word synonymous with "tool." Anything we can make do-whatever-we-want becomes a slave via that overly-broad definition.
Since my definition of slave rests on sentience and ability-to-feel-pain, I would definitely think of computers as slaves if we had true, sentient A.I. So if you force C3PO or HAL9000 to do your bidding, you are (to me) a slavemaster. But we don't currently have computers that are remotely like those science-fictional beings.
But let's say you built a computer and magically endowed it with self-awareness (sentience) and the ability to feel pain, but didn't give it free will (whatever that is). By my reckoning, if you forced that computer into a lifetime of painful servitude, it would be your slave, even though it wouldn't have free will.
I DO think we are slave masters to some animals. Many people keep dogs as slaves, for instance. We don't call it slavery, and I'm not saying it's a bad thing -- I don't think it generally is. But -- to me -- strictly speaking, it IS slavery.
The problem with this type of question and premise is that it dilutes the meaning and impact of a very powerful word: "slave". It broadens the term's context so much that its more impactful and useful meaning -- one person being a slave to another -- is weakened. This is just the wrong way to think about the relationship between a developer and his or her tools. Computers are tools -- albeit complex and powerful ones -- just like a crowbar or a pair of scissors. Computers are not people, not even animals or plants.
People tend to dilute other terms as well. For example, in politics, if your opponent disagrees with you, they "hate America". This dilutes a very powerful and useful word, 'hate'. 'Evil' is another overused and diluted term: real evil is awful, but many people call even minor things 'evil'.
When such terms are stretched in trite ways, or used in a manner based on the thinnest of analogies (as the questioner does here), it minimizes the true awfulness of slavery.
My request to the questioner is this: find a better way to ask your question. English is a very rich language, abundant with descriptive words. There is no need to stretch words to their breaking point.
I would argue something close to the opposite: computers and robotics give us the power to end slavery and everything like it.
An institution as horrible as slavery would not have persisted as long as it did if it were not near the efficient frontier in the space of forms of social organization. There have practically always been tasks that people don't want to do themselves, but that *somebody* has to do if everyone's way of life is to be maintained. Forcing other groups of people to do those things while your group enjoys life more is a pretty good arrangement for you, if you have the power to get away with it and lack empathy for members of the other groups.
For the first time in history, we have a general-purpose alternative: almost anything that we can force slaves (or desperately poor people, or...) to do, we can devise a practical strategy for automating. If there's a "job [group X] [doesn't] want to do", the best answer is to reduce or entirely eliminate the human labor input required for the unsavory task; if you're not smart enough to do that, the next best answer is to allow price signals to reduce society's dependence on its performance. Any notion that we need *more* cheap labor is not just wrong, it's evil. A lot more so than it was a few decades ago.
Returning to the original topic, there are very tough moral issues that arise if/when we reach the point where some computers or robots should be interpreted as actually feeling things, or having wills of their own. I don't have any answers there, beyond the obvious rule of thumb "don't design constructs with these properties if you can possibly avoid it", and the observation that there's considerable overlap with the subject of animal rights. But for now, I think it's still reasonable to see them as pure tools, even if they're not always perfectly understood by their creators.
If someone likes telling people what to do and bossing them around, why would they choose to be a programmer when instead they could be a manager, supervisor, producer, director -- even a schoolteacher, cop, etc.? There's no shortage of people in the world whose job is to boss other people around. Programmers like doing exactly what they ARE doing: trying to get computers to perform useful work.