New ethics courses in computer science

(nytimes.com)

126 points | by htiek 2257 days ago

33 comments

  • ChuckMcM 2257 days ago
    I grew tired of arguments supporting fleecing the users that were basically "We aren't making them do this, they choose to do it." I have heard them put forward at nearly every company I've worked at, at various levels and through various departments. At Google it was always "We don't take an editorial stand, this might be just what some of our customers want." That has been the most interesting aspect of their recent moves to either sanction advertisers or block them. So somewhere in there it has gone from "If people don't want to see ads there are lots of adblockers out there to choose from." to "We need to take a stand against abusive advertisers." And that is a huge difference in approach.

    Do engineers need a Hippocratic oath? I'm not sure they do, but if we forced some liability on companies for what the software they sell does, that would change a lot of things fairly quickly.

    • kbenson 2257 days ago
      > At Google it was always "We don't take an editorial stand, this might be just what some of our customers want."

      It's an interesting choice for companies, and Google in particular. Either be proactive and get accused of forcing your customers' behavior or of having ulterior motives based on money (e.g. Google's ad blocking program), or let them do what they want and get accused of turning a blind eye because it makes more money.

      That said, I have little sympathy for Google in this case. They made the choice to go for ad revenue as their business model years ago, and it may have been the most feasible path to success when they did so, but that doesn't mean the perverse incentives weren't obvious at every single step along that path.

      I work in the event ticketing secondary markets. That is, I work for a brokerage that buys and sells event tickets for a profit. I mention this because a lot of people have a very negative view of this industry (some of it misinformed, some very well founded in the actions of some bad actors). We run an above-board shop and make money through lots of analytics and targeted investment, and I sleep fine at night. I'm not sure I would if I were employed in certain departments of Google or Facebook.

      • Balgair 2257 days ago
        Google can't take an editorial position; they are (sort of) forbidden from doing so by law: 47 U.S.C. § 230, the Communications Decency Act (1996).

        Per the recent Wired article on FB: "This is the section of US law that shelters internet intermediaries from liability for the content their users post. If Facebook were to start creating or editing content on its platform, it would risk losing that immunity"[0]

        The EFF has a good piece on the importance of upholding this law[1]. Basically, everyone from ISPs to Craigslist can repost/report on potentially horrific stuff without getting in trouble themselves as the 'host'.

        If Google were to take an editorial position, they are afraid they would run afoul of this law and be held liable.

        [0]https://www.wired.com/story/inside-facebook-mark-zuckerberg-...

        [1]https://www.eff.org/issues/cda230

        • wanderer2323 2257 days ago
          When you are training your search models you take an editorial position all the time. There is no 'one single truth' of what the search results should be, let alone what the search-related artifacts (query suggestions, drill-downs, answers, etc) should be.

          Real people go to work every day to label images as 'corresponding' or 'not corresponding' to a query, different people write guidelines for this labeling, other people curate which queries to label and which labeled results to train on, and so on.

          In theory, all these people or at least their accumulated work produce some kind of 'neutral' result; in reality, a systemic bias on some of these levels can easily have an editorial effect that is impossible to prove.

          • Balgair 2257 days ago
            I'm certain that this is going to come up in anti-trust suits as a violation of Section 230. I can't see how it isn't editing the content that folks see, despite the complexity of the ML. Judges don't like being told 'oh, it's too complex to understand, gee shucks, let us get away with it.'

            Surprisingly, such a ruling will have some philosophical ramifications regarding the ability of a computer to think and then edit. Though the case is unlikely to hinge on that semantic point, it will be talked about in lawyer bars.

      • mulmen 2257 days ago
        I'm curious, what value does your ticket brokerage create?
        • kbenson 2257 days ago
          For customers, liquidity and price accuracy, ticket availability, and the chance for discount tickets (in the case where brokers make a bad call or execute badly, which happens often).

          For venues and promoters, guaranteed attendance and immediate cashflow (sell 50,000 tickets at an average of $80 immediately instead of spread over 9-12 months; that's money that can be invested back into their business or something else, and it reduces risk).

          For artists and promoters, the capability to hold back chunks of inventory for later sale on the secondary market at increased cost. This allows them to take advantage of a functioning market to make more money while also avoiding fan displeasure at high ticket prices. Also, the ability to say they sold out X size venue in Y time, which can denote popularity (or be used to claim a level of popularity).

          Brokers take on risk for a possible reward. If you're buying tickets that aren't intended for immediate resale, and are holding them for 9-12 months (common), anything can happen in that time period. That artist may become less popular, or even just get sick and cancel much of the tour (in which case you just had your money tied up for months, at best losing out on other investments and at worst paying some percentage on a credit account), which is a loss.

          It's not really all that different than other financial markets.

          Edit: As much as some artists like to complain about the secondary market, there's a really simple solution that just works. Increase supply. Garth Brooks plays twice a night and multiple days in a row in the same venue at each stop on a tour. Kid Rock will play seven consecutive days in Detroit. The downside? They move the risk from the brokers to the venue, promoters and artist, because they may lose money if they don't fill enough seats. This itself is an illustration of the role the secondary market plays, and indeed heavily bought events with inflated prices often get additional dates added, which depresses the market prices.

          • throwawayjava 2257 days ago
            Engineering ethics takes two forms: "what should I build" and "how should I build it".

            I tend to prefer "engineering ethics" courses that focus on the second question because the first question is completely parametric in more general ethical and even political considerations.

            It's true that engineers should take a course of study in pure ethics to learn how to think through questions of the first variety, but I'm not sure if engineering departments are the right ones to house/teach that particular course.

      • logicallee 2257 days ago
        >I work in the event ticketing secondary markets. That is, I work for a brokerage that buys and sells event tickets for a profit.

        FYI you work for a ticket scalping shop, which is why you put so much verbiage there rather than saying so.

        Based on what you've written, it's not clear what your firm adds to the world through its analytics and "investment". Which is why people don't like scalpers. Other than the scalpers, who else is better off if audience members pay a surplus over list price? (Or end up not going, as you've bought the last seats and priced them out.)

        This is why "scalper" has worse connotation than "advertiser." (I don't work in advertising.)

        By all means please let me know if you generate some value I am ignorant about, as you explicitly state is often the case ("misinformed").

        • kbenson 2257 days ago
          > Based on what you've written, it's not clear what your firm adds to the world through its analytics and "investment".

          Perhaps if you had read the thread a bit more, you would have found my answers to these questions, and you could have formulated useful questions that didn't ask for things that have already been provided.

          If you would like to follow up to that comment with questions or complaints about how I've presented myself or the industry I work in, feel free to do so. I would be happy to engage with you on any criticisms you have of the points I've presented; I only hope you approach it in a less hostile manner than you have here.

          • logicallee 2256 days ago
            The other comment by lovich says:

            >I've worked in that industry as well and I have a very negative view of it still, even discounting the bad actors. Everything you do can be above board and still be douchey.

            As far as your points, I have this specific question:

            - Where you write, "For venues and promoters, guaranteed attendance and immediate cashflow."

            Why is there "guaranteed attendance"? I don't understand what you could have meant by it.

            • kbenson 2256 days ago
              > - Where you write, "For venues and promoters, guaranteed attendance and immediate cashflow."

              > Why is there "guaranteed attendance"? I don't understand what you could have meant by it.

              That's an inaccurate description on my part for what I meant. It's not guaranteed attendance, it's guaranteed sales regardless of attendance (although generally the seats are sold, even if at an extreme discount). It's guaranteed money available early on in the event lifetime.

              For an example of this, look at this TicketMaster event[1] and the corresponding StubHub page[2]. All reserved seating is $45 on TM, and there are still plenty of tickets available; the floor tickets are $49.50, and they are still available as well. Now look at the secondary market (StubHub, in this example), and you'll see Floor seats starting at $19, Orchestra seats starting at just under $30, Lower Balcony seats starting at under $11, and Upper Balcony at $29 (I'm not sure why it's higher; I suspect this market isn't very liquid, and/or those are mostly non-brokers selling in the expectation of getting more money back than is likely to happen).

              I have a few takeaways from this specific example (and it's by no means my job to make these assessments; I'm a software engineer, I write in-house tools and connect to APIs):

              - People expecting to make money on the secondary market here are losing a lot of money. Exchange fees are generally between 7 and 10% of sale price for brokers, depending on volume, and TicketMaster initial display prices generally do not include all the other fees. A single Floor ticket (listed as $49.50) actually shows a subtotal of $67.45 if you attempt to purchase it through TicketMaster right now.

              - The venue, promoters and artist have all that money put down for overbought tickets. Even if all the tickets held by brokers eventually sell, and even if they sold at a profit, those stakeholders have been able to make use of that money in the last three months since the tickets went on sale and most brokers invested, while the brokers have not. The artist and promoters have left money on the table in their pricing (which brokers attempt to capitalize on) but in exchange for that they get less risk (more sales at a low price) and more money at an earlier stage.

              This example is fairly generous to brokers in that it's an event where they are subsidizing users, but I think it's an important example to bring up because so many people don't even account for this in their reasoning. Whether brokers make money or not, some of the same things apply, such as stakeholders getting money early and having a fairly good (most of the time) accounting of demand for a sale by outsourcing that aspect to the crowd. It's an added benefit that artists and promoters (and venues) can take large chunks of tickets that were never sold on the primary market and sell them on the secondary market for additional profit.
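              A minimal sketch of that arithmetic, assuming the example prices above and an exchange fee at the high end of the quoted 7-10% range:

                def broker_net(purchase_price, resale_price, exchange_fee_rate=0.10):
                    # Net profit/loss for a broker who buys on the primary market and
                    # resells on an exchange that takes a cut of the sale price.
                    fee = resale_price * exchange_fee_rate
                    return resale_price - fee - purchase_price

                # Figures from the example: $67.45 all-in primary cost, $19 floor resale.
                # The flat 10% fee is an assumption, not a quoted number.
                print(broker_net(67.45, 19.00))  # -> -50.35, about a $50 loss per ticket

              At those numbers the broker eats roughly a $50 loss per floor ticket, which is the buyer subsidy described above.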

              In some aspects of what they do, ticket brokers are like high-frequency traders; in other aspects, they perform other market functions (I'm a novice at best in the stock market, so I'd be hard pressed to explain this in detail).

              1: https://www1.ticketmaster.com/event/1B005363D07BB554

              2: https://www.stubhub.com/queens-of-the-stone-age-tickets-quee...

              • logicallee 2256 days ago
                Thanks, this was very interesting. You should write a blog post about your understanding of the secondary ticket market, working in that industry. What you've written is an interesting contrast to the underlying assumptions outsiders have about it.

                I can't speak to whether you accurately see the underlying trading strategies or not but your writing was interesting. Thanks for taking the time.

      • lovich 2257 days ago
        I've worked in that industry as well and I have a very negative view of it still, even discounting the bad actors. Everything you do can be above board and still be douchey
    • batmansmk 2257 days ago
      I signed one oath in my engineering school in France to get my degree. I lose my title if I'm caught not respecting it. The title gives me access to a few jobs that require it (state, secret services, C-level of a few companies) as well as a special tax cut for companies using employees with the title.
      • throwawayjava 2257 days ago
        > as well as a special tax cut for companies using employees with the title.

        This is an interesting point in-between zero intervention and over-bearing puritanism codified in law. I'm having trouble finding more information. Could you share a link?

    • platz 2257 days ago
      Imagine the form of this logic applied to politics.

      'We don't have any policy positions or visionary goals. Those come from what our voters want'.

      Ostensibly, this is what the system is designed to do. But if one believes elected officials have no agency or accountability, they are fools.

      Now politics is different from media, though. Maybe one difference is that a media platform can remain impartial, which a politician cannot do. So maybe the OP quote is better targeted at non-social-network companies, re ads @ google

      • zdragnar 2257 days ago
        If "the system" you're referring to is the USA, then no, it's not designed for that.

        We elect people based on ideology and character. They may have a few central policy platforms, but there are so many laws and resolutions passed that they cannot pre-advertise their positions or poll their constituents for every one.

        Hence we are a republic (or representative democracy, if you will). We elect people on the basis of trusting they will do the right thing, because no one has the time to track municipality, state and federal votes.

        Among the many experiments in local organization that early communist China ran was essentially a pure democracy (for local matters), and the inevitable result was that there were so many votes that people were overwhelmed and disengaged.

        • Caveman_Coder 2257 days ago
          > "We elect people on the basis of trusting they will do the right thing"

          That's not currently working too well for us...

          • zdragnar 2254 days ago
            We are a very diverse group. I detest living in big cities, and recently bought 2.5 acres out in the country. Couldn't be happier. Many of the laws and regulations in the big cities wouldn't make any sense to apply out here.

            Likewise, purely popular votes on every issue would ultimately mean that the slimmest of majorities wins, every time, until the squishy "middle" voters get sick of it and switch sides again.

            The net result is the same. One side feels left out at any given point, and the direction of the country zigs and zags.

            The modern attitude of extreme hyperbolic reaction to every little detail could have many causes; my money is on social media (where speaking often is more rewarding than listening thoughtfully). It also doesn't help that our news sources are in such fiscal dire straits they seem to think the only newsworthy items are those that stoke rage or FUD.

            Things are going reasonably well, all things considered. Lots of things could be going a lot better, of course, but I suspect that's always been true, and will continue to be true.

    • kazinator 2257 days ago
      Indeed, there is a problem with the "they choose to do it" argument: namely that if enough clueless people around you choose something that you don't want, the choice may be foisted upon you by social forces.
      • TeMPOraL 2257 days ago
        And, from another side, if you only have a few providers (economies of scale, network effects, or other barriers to entry for competitors), it's the providers who choose, and users only get to pick from what little is available.
    • skybrian 2257 days ago
      For ads I thought Google's usual justification was "some ads are useful."

      It will be interesting to see what happens in response to Chrome's new ad blocker. I'm not sure there's that much difference ethically between opt-in and opt-out, but certainly a big practical difference, and it will change the ecosystem.

    • ben_jones 2257 days ago
      All it takes to get a 22-year-old to write invasive ad tracking code, VPNs that spy on users, or deceptive interfaces that steal tips or addict users is around $150k combined compensation.

      Can we stop pretending we (~tech/software) are superior to other industries? We are as bad if not worse than finance, big oil, etc.

    • artificial 2257 days ago
      >if we forced some liability on companies for what the software they sell does, that would change a lot of things fairly quickly.

      Absolutely that would change things, especially the cost of software.

      • kerkeslager 2257 days ago
        I'm so tired of people saying stuff like this as if I should care. Go ahead, raise prices. If your product is solving a real problem, people will pay for it. If not, well, we didn't need you anyway. This is, incidentally, how capitalism is supposed to work.

        If your business model doesn't work without you doing bad things, then it shouldn't work. You don't have a god-given right to make money and if you can't make money doing prosocial things, you don't deserve to make money.

        Free software has been doing the right thing for decades. The internet of the 90s, before everything was carefully tracked and designed to addict and display ads, was better. There are lots of incentives besides money out there and if money is the only one you care about, I'm not on your side.

        • artificial 2245 days ago
          It isn’t necessarily that. What incentive do you have to risk your livelihood for no compensation? For example, when there was a flaw in OpenSSL, the contributors would have been on the hook for all the damages to the businesses that used the affected software.

          An operating system would cost hundreds of thousands of dollars to buy. Look at how actual engineers are licensed and bonded.

        • thomastjeffery 2257 days ago
          Usually the unethical things are designed to corner customers into spending more: DRM, proprietary game server hosting, etc.

          The problem we have with capitalism now is that it is abused by large corporations who fight for their right to monopolize and abuse customers. What everyone else has lost is the right to compete without abusing customers.

      • jakelazaroff 2257 days ago
        > Absolutely that would change things, especially the cost of software.

        In other words: the current cost of software is artificially low because companies treat users unethically.

        • thomastjeffery 2257 days ago
          It's also artificially high for the same reasons.

          I would posit that prices are artificially high in more cases.

          • jakelazaroff 2257 days ago
            Are you saying that the cost of software is artificially high because companies are unethical?

            That doesn't really make sense: consumers value both price and ethics, and are often willing to pay a premium to companies they perceive as more ethical. If a company could come up with a product that's both cheaper and more ethical than their competitors', they'd easily win over all their competitors' customers.

            • thomastjeffery 2257 days ago
              It depends where you look, and what you consider to be unethical.

              One example is the game Battlefield 4: Dice/EA does not release the server software, so in order to host a server, you must rent one.

              This means that only those who have a private deal with the company can host servers, leaving people in places like West Africa underserved (no servers under 100-200ms), and giving those who pay to host servers unnecessary authority over players (arbitrary rules, reserved slots, etc.).

              This creates an artificial market based on copyright, and allows Dice/EA to get more money by abusing their customers.

              > That doesn't really make sense: consumers value both price and ethics, and are often willing to pay a premium to companies they perceive as more ethical.

              Perception is not reality. One problem is that people have been trained via propaganda to respect copyright abuses like DRM.

              Because there are enough people respecting these abuses, I am forced to accept the abuses as status-quo.

              There are plenty of other cases where a company constrains their customers' liberty in order to coerce them into paying more. It's a problem that is exacerbated by blind anti-regulation policies and by setting monetary increase as the ultimate goal and ethos.

  • nemild 2257 days ago
    If useful, I wrote my own thoughts on ethics in software, after reflecting on certain experiences over the years:

    > A serial tech entrepreneur in Silicon Valley once asked me to design a “social stockade” for his financial services customers. It would lock people out of their social media accounts and tweet out/FB share to their friends when they hadn’t paid a loan. He pitched it to prospective employees as meaningful work that would reduce the cost of loans for the needy.

    > I was horrified that his product was being built and that many others would likely take the role I was turning down. And he was hardly the first to pitch his “innovation” as providing only good.

    https://www.nemil.com/musings/software-engineers-and-ethics....

    If anyone ever wants to discuss something, feel free to reach out (see HN profile).

    • nostrademons 2257 days ago
      Interestingly, many microfinance programs that are widely heralded as having lifted many people out of poverty (eg. Grameen Bank, CARE) rely heavily on peer pressure to boost their repayment rates. They lend out to groups within a village, and then if any one member of the group fails to repay the loan, the group can't access more capital. This creates a strong incentive for other members of the group to exert social pressure to make sure everyone pays back their loan.

      ...which goes to illustrate the complexity of most ethical issues that arise out of social systems. Oftentimes, something that seems cruel to an individual within the system is actually in the best interests of the participants of the system as a whole, and sometimes can even be in the long-term best interest of the person themselves. And then whether you view such features as cruel & unethical or necessary & beneficial depends on your perspective & role within the system.

      (This could also be taken as a synecdoche for capitalism itself, which on a micro level is about as cruel as you can get - individuals compete in a race to the bottom to do things more cheaply, and nobody will help you unless it serves their interests too - but on a macro level is the most effective system we know of for satisfying consumer wants.)

      • jclos 2257 days ago
        > They lend out to groups within a village, and then if any one member of the group fails to repay the loan, the group can't access more capital.

        That reminds me of those loan systems where you need to send naked pictures of yourself as collateral, which seem to have become popular in China (at least according to Western media).

        https://qz.com/707770/chinas-college-students-are-using-nude...

      • nemild 2257 days ago
        Absolutely, and I say that having worked in microfinance before receiving this request.

        But to me, that doesn't mean we engineers can't still draw a line somewhere, especially if we are called on to participate. Just because peer pressure works in some context, it doesn't mean that it is always the right choice, and we need to debate the tradeoffs in different contexts (much like engineers debate tradeoffs in any technical decision).

        For example, the easiest way to cut the price of loans down would be to kill anyone if they didn't pay; defaults — and loan costs — would fall dramatically. This would immediately provide loans to many people who are priced out. While that may be useful in some scenarios, it's not a system I personally believe in.

        • justin66 2257 days ago
          > For example, the easiest way to cut the price of loans down would be to kill anyone if they didn't pay; defaults — and loan costs — would fall dramatically.

          You state that as a hypothetical, but it's not like this has never been tried. The loans handled by lenders who include the threat of violence in the repayment plan are absolutely not characterized by low costs.

          • nemild 2257 days ago
            But loans of this kind (such as from a loan shark) have their own set of risks that have to be factored in and affect the interest rate:

            - You may have no legal recourse and no collateral to seize

            - The loan may be funding risky or illegal activity with a high likelihood of failure, which demands a higher interest rate

            - There may be no competition that drives the price down

            Ceteris paribus, increasing the cost of non-payment should reduce interest rates. If you relax the "ceteris paribus", then all bets are off.

            Another way to see this is this question: if the lender had to forsake the threat of violence, would the loan price go up or down?
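            A minimal sketch of that ceteris paribus claim, assuming a lender that just needs to break even in expectation (the default probabilities and funding cost are made-up numbers for illustration):

              def breakeven_rate(default_prob, funding_cost=0.05):
                  # Smallest rate r with (1 - default_prob) * (1 + r) = 1 + funding_cost,
                  # i.e. expected repayment just covers the lender's cost of funds.
                  return (1 + funding_cost) / (1 - default_prob) - 1

              # Stronger enforcement is modeled only as a lower default probability;
              # legal recourse, collateral, and competition are all held fixed.
              print(breakeven_rate(0.20))  # ~0.31 -> ~31% with weak enforcement
              print(breakeven_rate(0.05))  # ~0.11 -> ~11% with strong enforcement

            Relaxing any of the held-fixed pieces (the loan-shark risks listed above) can easily swamp that effect, which is where all bets are off.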

            • justin66 2257 days ago
              Introducing the threat of violence isn't smoothly adjusting a variable in a formula. It's introducing a gating factor that's going to keep not-desperate people from dealing with you.
              • nemild 2256 days ago
                I'm happy to discuss with you offline (see my profile). The point I'm trying to make is that stronger enforcement mechanisms should — on average — reduce the cost of loans. As I point out, there are real debates, which I've personally struggled with, about where to draw the line on what is appropriate lender enforcement.

                I apologize that my example isn't perfect, and you're absolutely right, there is selection bias, unless there is little recourse for other products.

      • fvdessen 2257 days ago
        I know of a microfinance company that hires former convicts to collect payments and receives government subsidies for it as part of a social reintegration program.
  • bpicolo 2257 days ago
    Some recent TechnoScifi has done a really interesting job getting this sort of stuff into the general public. Black Mirror, Altered Carbon are both terrific and dive into tech ethics to different extents, and the outreach there is many millions of viewers.

    That's sort of an interesting potential take on this - how do you take ethical questions and get drastically wider outreach for them (vs a static class)? Revisionist History has a recent podcast episode about how satire sort of lives in an interesting realm here (and how modern, western satire seems to miss the mark).

    http://revisionisthistory.com/episodes/10-the-satire-paradox

    • kerkeslager 2257 days ago
      > Black Mirror, Altered Carbon are both terrific and dive into tech ethics to different extents, and the outreach there is many millions of viewers.

      Also Electric Dreams.

  • spodek 2257 days ago
    I like their goals, but the traditional academic implementation described in the article and in the syllabus the article linked to won't achieve them.

    If you want people to learn behavior, you can't lecture them into it, nor will talking about case studies or writing papers help. Look at the behavior those classes teach: analysis, reading, writing, debating other people's behavior.

    Active, experiential, project-based, exercise-based learning will do the trick. Many professors think "flipping the classroom" or having more class discussions is active or experiential, but it rarely is.

    You have to get students acting on their values, feeling their own values conflicting with each other on projects they care about involving people in their lives that they care about, having others depend on their actions, having to perform on something they created, not spelled out for them in a case study. Then they learn empathy, compassion, responsibility, initiative, self-awareness, and ways to act in challenging situations.

    If you want to be an artist, you have to practice making art. Art appreciation classes won't hurt, but they won't help, any more than reading about or discussing lifting weights will build muscle.

    The classes they describe are ethics appreciation or leadership appreciation. Well-intentioned, but limited.

    • ebenrock 2257 days ago
      This is very much like the gender imbalance in tech - starting to address it in college only provides a band-aid. This is a much deeper issue that starts soon after birth.

      When I was a CMU grad student, they had the Reasonable Person Principle to help guide your actions and interactions. The principle states nothing about ethics, but generally that you should be open to others' concerns, practice self-reflection, and even accept that their viewpoints differ from yours. In spite of this principle being in place, I definitely dealt with at least one very unreasonable person during my graduate studies there.

      I've been in several ethical conundrums in my career. In a couple cases my choice was the "lesser" unethical option among many. It can be difficult to make those choices when your career or employment is on the line. I've left jobs because I believed (or knew) the work I was doing wasn't quite on the level.

      Trying to recreate these scenarios, realistically, in a classroom is pretty hard. Having discussions about ethics is nice, but probably not very effective. Could you build a course, or assignment, where the only way to get an A is to cheat or act unethically? Would that even be ethical for the university to offer?

      In my cynicism, this looks like a "cover-our-asses" maneuver by universities, at least in part.

      • rvo 2257 days ago
        The Reasonable Person Principle was the best thing I learned from CMU

        * Everyone will be reasonable.

        * Everyone expects everyone else to be reasonable.

        * No one is special.

        * Do not be offended if someone suggests you are not being reasonable

      • hopw_roewur_ne 2257 days ago
        Soon after birth? Partly, but this is a much deeper issue that starts soon after conception. Actually scratch that, it started soon after sexual reproduction, and therefore sexual selection, was introduced about a billion years ago.
  • ilamont 2257 days ago
    I attended the ARinAction summit earlier this month, and heard an interesting tidbit from the MIT Media Lab's Pattie Maes: She requires new students joining her program to watch Black Mirror (1).

    In contrast, when I attended business school one of the models held up to us in the very first week was the team at Harrah's who designed a loyalty program for frequent gamblers (2). I remember one of the professors or someone in a video interview we watched crowing, "it was like printing money."

    Neither the case nor the instructor had anything to say about the fact that this was basically a technology-driven scheme to extract as much money as possible from members of the public, including gambling addicts and other vulnerable populations. The "big question" at the conclusion of the case reads:

    When asked about the company’s long-term vision for its RM system, a member of Harrah’s RM team became thoughtful for a moment. He responded that, while all of the near and longer-term developments described above were critical, he thought there was one important aspect of all RM systems that needed further development. "What I’d really like to know — and I pose this as a question for researchers in revenue management — is how to integrate information about price elasticity into these systems. Clearly, changes in price affect the level of demand we experience. However, none of the systems we are familiar with capture this effect."

    1. https://theoutline.com/post/3167/black-mirror-mit-class?zd=4...

    2. https://pubsonline.informs.org/doi/pdf/10.1287/ited.1090.003...

  • subroutine 2257 days ago
    Whenever I hear about these ethics courses I'm mainly curious about what non-obvious substantive content is being taught (because there is apparently enough to fill a semester-long course). Anyone who has taken one of these courses care to share something they learned that they had never considered before the course?
    • lumberjack 2257 days ago
      The philosophical aspect of ethics is not devoid of substance. There are many ways to think about the ethics of a situation. You learn different frameworks of ethical thought and you gain new perspectives. Some people who are very ideological will write this off as useless bullshit, but only because they are very invested in only one perspective and not open-minded enough to consider the merits of other ethical frameworks.

      To give you a programming analogy, it is as if you always programmed imperatively, using C, because you learned that organically as you grew up, and then you take this class and you learn about functional programming and object oriented programming and you learn how you can think of the same problem from a completely different perspective.

      Except with the crucial difference that the end result will not necessarily be the same and in fact there is not always a right answer. But when there is not always one right answer, it is better to know many possible answers, and why they are possible answers, than to know only one.

    • burkaman 2257 days ago
      My school had a required Computer Ethics course for CS majors. It was a bi-weekly discussion section with readings and a final essay, so not really a full size course. We covered intellectual property, privacy (SOPA was in the news at the time), net neutrality, and free speech/social networks.

      These are all obvious subjects, but the idea was to get people to read about them and debate them, rather than just have an uninformed opinion and never discuss it.

    • oh_sigh 2257 days ago
      I'm sure there is non-obvious substantive content, but it is more akin to navel gazing than real-world applicable knowledge.

      https://aeon.co/essays/how-often-do-ethics-professors-call-t...

      > Ethicists do not behave better. But neither, overall, do they seem to behave worse

    • matt4077 2257 days ago
      You can easily fill a semester just discussing the trolley problem.

      The most important aspect of such classes is invariably to teach that technical decisions have ethical consequences. Just getting people to think about those would be a big win. Too many people still take an "it's the person that kills, not the gun" attitude to their task of building a better gun.

  • seabird 2257 days ago
    All of these noble ideas of ethics and social responsibility are great for everyone who feels bound by them. Many of the rest can't be bothered. People who have to be taught that the missile they're building is going to kill people and that they should feel bad about it probably already know that the missile they're building is going to kill people and they don't feel bad about it. You see this issue come to a head in computer technology because much of it doesn't have absurd cost/precision requirements like weapons, drugs/pharmaceuticals, etc. have.
    • taurath 2257 days ago
      I think people come up with all sorts of justifications for why what they’re doing isn’t wrong. Concerted effort to knock down those justifications and socially shame those that cling to them does work - consider all the people who don’t want to go work for an ad company or a defense contractor, or those who leave with one of the reasons being to escape the industry that they know is wrong.
      • peoplewindow 2257 days ago
        I think the opposite - people come up with all sorts of justifications for why what other people are doing is wrong. This lets them feel self satisfied, virtuous and perhaps a little smug effectively for free, and if their position is perhaps a little thinly thought out, well, no big deal, it's not like anyone is going to listen anyway.

        I've watched many attempts to tar entire industries as evil over the years. Invariably the people doing the tarring look foolish or naive - like they can't think more than one step ahead, or like they live in a world where tradeoffs do not exist.

        To pick just the two examples you chose: without defense industries countries would be ripe for being taken over by even a slightly aggressive invader who would immediately commit all sorts of horrible atrocities. That's why defense exists. Given that countries have been invading each other for thousands of years, it's a massive stretch to believe we are in a post-war society and people who attack defense workers invariably never try to argue that. They don't seem able or willing to think the next step ahead: "ok, everyone refuses to build weapons.... then what?"

        And as for ads, if you remove all ads from the internet, TV, cinemas etc then all those things would suddenly become way more expensive. Good luck affording an internet connection if your daily browsing habit isn't being subsidised by advertisers anymore. That would be a fast way to ensure nobody poor could use the internet. Do you hate the poor? Probably not: more likely you never thought about the consequences of not having advertising.

        • taurath 2257 days ago
          Your examples are rather extreme and not what I'm advocating at all - there's a big difference between the Navy and Blackwater.
        • Maybestring 2257 days ago
          >Good luck affording an internet connection if your daily browsing habit isn't being subsidised by advertisers anymore.

          What portion of ISP revenue is from advertising? I suspect it is vanishingly small.

          • peoplewindow 2253 days ago
            ISP revenue - none.

            Revenue for all the free services and sites the ISP connects you to - almost all of it.

      • seabird 2257 days ago
        Consider all the people who do want to go work for an ad company or a defense contractor. Ethical outlooks are relative, but your outlook doesn't bother people who are "unbound," even if their outlook bothers you. Go ahead and knock down those justifications all you like, but some people can't be swayed. You could go ahead and jail them for their unsavory take and they would see themselves as a victim of a society that denies them their freedom.
        • taurath 2257 days ago
          I certainly wasn't saying that all or even 20% of people will be swayed. Just that it can make a difference.
    • maxerickson 2257 days ago
      Your example isn't interesting. It's starkly violent and you isolate the builders from the decision that results in violence.

      In many scenarios, the people building a thing will also work to push it out into society and inflict consequences less than death. Do you think they should not consider the consequences prior to "disrupting" things? The fact that some other people might not bother to consider the ramifications of their actions isn't an excuse to act irresponsibly.

    • Pica_soO 2257 days ago
      Usually it's just another job creation scheme for the philosophy department. If corruption or illegal activity is found after that, the only answer can be: employ more philosophers. Only the church of liberal arts can grant you salvation, if you vow to employ more liberal arts majors.

      I especially love the codes of conduct on web pages - usually only one click away is a map of all the locations where the company is active. If it's active in any Arab country and in Israel, it had to bribe somebody just to get off that anti-Semitic blacklist you get for doing business in Israel.

      So proof of corruption and code of conduct sit out in plain view next to one another. Got to love this shiny new world.

  • mikegerwitz 2257 days ago
    I'm giving a talk in March at LibrePlanet 2018 entitled "The Ethics Void". The lack of ethics in CS education is a core component of the talk. I'm neck-deep in my talk research right now, so if you are a student, educator, or anyone else with thoughts on ethics in CS, I'd love to hear from you:

    https://mikegerwitz.com/talks

    Unfortunately, the codes of ethics, courses, etc that do exist largely ignore user freedoms (in a software freedom sense) and the host of ethical concerns that come with it. If you have examples that _do_ address those issues, I'd really appreciate hearing about it.

    And please join us at LP2018 (hosted at MIT)!

  • zombieprocesses 2257 days ago
    We already have ethics in the philosophy department. I was a CS major and I took ethics and liked it so much I double majored in CS and philosophy.

    Ethics has no place in CS, any more than ethics is required in biology or physics or algebra.

    If universities want students to learn about ethics, then make ethics 101 a "required elective".

    • MereInterest 2257 days ago
      Your comparisons are rather odd, given that most universities do have ethics requirements for the sciences, focused on examples from the field in question.

      Biology: Do not repeat the Tuskegee experiments. Do not be the next Andrew Wakefield.

      Physics: Do not falsify data. Do not plagiarize results. Do not play fast and loose with statistics.

      I can definitely see corresponding examples being made for issues that affect CS.

      CS: Do not collect customer information that is not needed for the task at hand. Do not describe your machine-learning model as being free of bias based on race/sex if it was trained with real-world data that may be biased.
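      For that last point, a minimal sketch of the kind of check the claim would require before anyone says "bias-free" (the field names and toy data are assumptions for illustration):

        from collections import defaultdict

        def positive_rate_by_group(records, group_key="race", prediction_key="approved"):
            # Share of positive model decisions per demographic group; a large gap
            # is a red flag that the model absorbed bias from its training data.
            totals = defaultdict(int)
            positives = defaultdict(int)
            for r in records:
                totals[r[group_key]] += 1
                positives[r[group_key]] += int(r[prediction_key])
            return {g: positives[g] / totals[g] for g in totals}

        # Hypothetical model output: group labels plus binary decisions.
        print(positive_rate_by_group([
            {"race": "A", "approved": 1},
            {"race": "A", "approved": 1},
            {"race": "B", "approved": 1},
            {"race": "B", "approved": 0},
        ]))  # {'A': 1.0, 'B': 0.5} -- a gap worth investigating, not "bias-free"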

      • Zak 2257 days ago
        The examples from other fields offered here relate purely to the academic field, while one of your CS examples is very much related to software as a business.

        Biology as a business: don't try to patent the world's food supply.

        Physics as a business: consider whether your client is building a weapon out of your work and whom they might use it against.

      • Rylinks 2257 days ago
        Chemical engineering: If you don't know what you're doing, your reactor will blow up and kill people. Here are some cases where people didn't know what they were doing. Don't be like them.
    • matt4077 2257 days ago
      Many natural science programs actually have classes in ethics, sometimes mandatory, sometimes as an elective.

      The Technical University here in Berlin, Germany, has one of the most respected departments of philosophy in the country. The reason: After WW2, the British mandated that "hard" sciences should never again be taught without ethics.

    • bonoetmalo 2257 days ago
      There are many elements of ethics unique to software developers that should be taught outside of a basic ethics course
  • thomastjeffery 2257 days ago
    In an IT-related class I had in High School, part of the course was a lesson in ethics.

    Part of the "lesson" taught that if you have a good/unique idea, you should patent it, lest someone else get the value from it before you do, and that you should keep your software closed-source lest someone pirate it.

    There was no voice for free software or against the absurdity of software patents outside my own vocal retorts.

    This was part of a district-wide course, and probably contained popular ideas used by many other districts.

    Another thing I see in schools/colleges is that Microsoft will give free licenses to the school and students for their software so long as it is used and taught. Free software takes a backseat, and people are taught to use - and prefer - Microsoft's proprietary tools. Microsoft gets to control their target audience, and be seen as doing something generous, not abusive.

  • QML 2257 days ago
    Honestly, there should just be a required ethics component to all college curriculums; I am not sure why it needs to be technologically focused.

    I would actually say the tech industry is the least of our worries with regard to ethics; over the last two years, tech has been heavily criticized and as a result it seems that people in the field are willing to change.

    Can't say the same about any other industry.

    • spydum 2257 days ago
      Is this not common? Even at junior colleges ethics is typically a first-year requirement
  • wu-ikkyu 2257 days ago
    "I am convinced that if we are to get on to the right side of the world revolution, we as a nation must undergo a radical revolution of values. We must rapidly begin [applause], we must rapidly begin the shift from a thing-oriented society to a person-oriented society. When machines and computers, profit motives and property rights, are considered more important than people, the giant triplets of racism, extreme materialism, and militarism are incapable of being conquered."

    -Beyond Vietnam (1967), Dr. Martin Luther King Jr.

    http://www.americanrhetoric.com/speeches/mlkatimetobreaksile...

    It seems futile to silo this as a problem of "tech", when really it's a problem of society at large: that profit motives are largely considered more important than people.

    • dragonwriter 2257 days ago
      > It seems futile to silo this as a problem of "tech", when really it's a problem of society at large

      Of course, but while it is broadly socially acceptable to criticize a particular industry or technology, criticizing capitalism is less acceptable; indeed, redirecting frustration at particular (and changing periodically) industries and away from the system itself is a key defense mechanism.

  • fortythirteen 2257 days ago
    I tend to tune out every time a journalist cites the main problem with "the dark side of tech" as "fake news".

    Firstly, it's devolved into a term to play upon the confirmation bias of people who think those who hold a different opinion than them did not arrive there out of differing life experience, but because they must be either stupid or evil.

    Secondly, there are many dark sides of tech that are of greater importance than what is usually a subjective assessment that news is fake, such as unfettered personal data mining, engineered addictiveness, cooperation with oppressive governments, and growing soft-censorship of users whose politics differ from that of major platforms' operators.

  • lostcolony 2257 days ago
    That's one thing that, in hindsight, I quite liked about undergrad CS at Georgia Tech. CS4001, Computer Ethics, was required.

    It didn't try to push a specific worldview, but rather asked people to consider the ramifications of technical (and business) decisions, to discuss them, and to recognize the stakeholders beyond just the company paying. At the time I went through it, DRM was a big topic, and big data concerns were beginning to be (especially as it was noted that the problem was more than just what you stored; it was what -everyone- stored, and the ability to correlate it, so you had to consider what else was out there). I imagine the latter now dwarfs the former.

  • whatok 2257 days ago
    So the same places where all of these companies got their groupthink from are supposed to fix the problem?
  • deckarep 2257 days ago
    Also watch Black Mirror on Netflix. Beyond the satire, dark humor and guilty entertainment this show offers quite a lot to consider on this exact subject matter.
    • pdkl95 2257 days ago
      I very highly recommend this[1] outstanding analysis of Black Mirror. It's short, it explores how Black Mirror fits in the history of sci-fi, and is one of the most concise explanations of the root cause of this kind of "tech problem" (hint: it isn't actually caused by technology; it's how people use it. Technology is just an amplifier).

      [1] https://www.youtube.com/watch?v=hr9_DcO6G3A

  • woodruffw 2257 days ago
    I'm teaching a 1-credit class on ethical hacking[1] this semester at my university, and I'm really glad to see this issue gain serious traction in major CS departments.

    I only wish it happened earlier -- I wanted to help develop a (more general) CS ethics class around a year ago[2][3], but encountered resistance from instructors and professors over perceived impracticality and adding non-technical "burden" to the major.

    [1]: https://github.com/UMD-CS-STICs/389Rspring18

    [2]: https://news.ycombinator.com/item?id=14680425

    [3]: https://news.ycombinator.com/item?id=14106201

    • aoki 2257 days ago
      kudos.

      ABET-accredited engineering programs have had to document instruction in design ethics for a long time. anybody designing a good CS curriculum should know that and be thinking about whether CS really ought to be different in that regard.

      berkeley has offered CS 195 [0] for more than three decades.

      [0] http://inst.eecs.berkeley.edu/~cs195/

  • foxrider 2257 days ago
    I think it's a dubious waste of time, because forcing people to take ethics classes doesn't mean that they are going to stick to the proposed ethics. I took an ethics class at my uni back in the day because I was curious what it was about and it also was a fairly easy one to pass. The prof was nice and informative, but her conclusions were something I would disagree with all the time, and I left the course only with knowledge of some historical stuff; none of my views on what's ethical and what's not had been changed. If anything I only got to solidify my position based on facts provided by her. Unless people adopt these ethics willfully, there is no forcing them to.
  • workthrowaway27 2257 days ago
    I doubt these ethics courses do anything to change people's behavior.
  • samzeisler 2256 days ago
    This is incredibly long overdue and so important. My partners and I had designs on creating an ethical framework for technologists years ago but didn’t feel we had the platform or the reach to spread it. That’s no excuse for sitting idly by and doing nothing, but nonetheless we are thrilled to see this now coming into play. I hope that the professors and professionals who are contemplating this will come together and form an alliance to create a national standard policy and statement around this.
  • drdeadringer 2257 days ago
    Someone recently suggested the podcast "Engineering Commons", and I'm currently playing catch-up. They had an episode about "Ethics", in which they discuss the topic with their guest.
  • tonetheman 2257 days ago
    It is interesting but this is in direct opposition to "make money fast and do whatever you need to do." When shareholders come first there is no way ethics will ever be involved.
  • jakelarkin 2257 days ago
    great, but how about also focusing on the MBAs and non-tech background managers/execs that fill out the 5-10 levels of hierarchy above line engineers at any BigTechCo. Not like bottom-of-the-rung SWEs necessarily have visibility or control of the company selling some ads to customers using them for disinformation campaigns. Or how about teaching fact-checking & propaganda skepticism to ALL citizens.

    too often the problems currently ascribed to "tech" are problems of society as whole.

  • lev99 2257 days ago
    Ethics is already required for ABET accreditation. Almost every good United States based university-level Computer Science program is ABET accredited. I took an entire two-credit course on ethics to receive my degree, involving writing at least four essays. One of the essays discussed the ethical considerations for writing code utilized by the military. The idea that computer programmers can do evil is not new. While some universities are creating new computer ethics courses, creating new computer ethics courses is not new. How is this worthy of national news? NYTimes has been increasingly disappointing.
    • IntronExon 2257 days ago
      In the absence of regulatory bodies, professional sanction and the like, it's worth even less than most accreditations in the tech world.
      • aoki 2257 days ago
        ABET accreditation is for degree programs, not individuals. it sets curricular standards.

        most credible engineering programs maintain ABET accreditation, as it has been required for graduates to become PEs [0].

        [0] i recently heard that berkeley EECS is dropping its ABET accreditation, as this is no longer true: https://eecs.berkeley.edu/sites/default/files/abet_letter_to...

  • ssebastianj 2257 days ago
    Don't forget "Dark Patterns" [0]

    [0] https://darkpatterns.org

  • bobthechef 2257 days ago
    I sincerely hope that consequentialist ethics won't dominate those courses.
  • pascalxus 2257 days ago
    I've been a software engineer for over 12 years and have never had to make a single ethical decision. It's always, here's the spec: build it.
    • lovich 2257 days ago
      Whether or not you choose to follow orders _is_ an ethical decision.
      • pascalxus 2257 days ago
        The laws align pretty well with Ethics. Most things that are unethical are also illegal and hence won't be built by any legitimate company.

        How often do you think a company chooses to build something that's unethical but legal? Have you ever heard of an engineer who said to the PM or boss, "hey, i'm not going to build that because it's unethical". I've never seen that happen. have you?

        • lovich 2257 days ago
          Any payday loan company? The financial service companies that figured out how to obfuscate shitty assets and led to the 2008 recession? Companies that mine your data and sell it to the highest bidder and rely on hiding privacy settings or making it too complicated for most people to prevent? Companies like Equifax that hold on to important personal data and don't even do the bare minimum to protect it?

          To be honest I'm not sure how to even respond to the statement that the law aligns pretty well with ethics. The law aligns pretty well with what the people with power want and in any study of ethics you quickly learn that legal != ethical

  • purple-again 2257 days ago
    Always a loser's game. You implement it with ethics in mind, I do not. I win and you fall into obscurity. The law is all that matters. If I cannot lose what I take, there is no reason for me not to take it (so long as I'm still chasing the 'fuck you money'; morals are great after you have it).

    I understand it, lots of people have dreamed big dreams of how great the world would be if everyone else was just like them too.

    • KirinDave 2257 days ago
      Flip side of this: People who say this are usually folks trying to justify the fact that they're resorting to underhanded, abusive tactics to compete with talented, successful people who are not. It is the battle cry of the mediocre, the hallmark of scammers, the ultimate admission of the untalented and unworthy.

      I know a lot of folks in fintech and blockchain, and we've all skated the outer edge of what's defined by law. The folks who give a damn about setting sustainable policy and not exploiting customers? Those are the folks who are still around. Even big US national banks, notorious for their immunity to law and enforcement, are starting to feel the pressure. Rumor is, customers left Wells Fargo in droves after the last kerfuffle, and new account openings went down substantially at Citi after their money laundering fine. The hidden cost of bad optics is immense. And as data science makes the formerly invisible behaviors of the world visible, society's going to get a whole lot more capable of identifying "bad behavior" and punishing it.

      To say "The law is all that matters", in the context of these industries, is beyond naive. It's not just wrong, but it's leaving money and opportunity on the table. It's bad business AND bad optics.

      > (so long as I'm still chasing the 'fuck you money', morals are great after you have it).

      If this ethos is so effective, why are you still "chasing?" Or is this the rhetorical we? Or the royal we? I can never tell here.

      • kerkeslager 2257 days ago
        > Those are the folks who are still around. Even big US national banks, notorious for their immunity to law and enforcement, are starting to feel the pressure. Rumor is, customers left Wells Fargo in droves after the last kerfuffle, and new account openings went down substantially at Citi after their money laundering fine. The hidden cost of bad optics is immense.

        There are two parts to a cost/benefit analysis, and you're only talking about the cost part of unethical behavior, as if the benefit doesn't exist. With Wells Fargo, Citi, Experian, Bank of America, etc., when their scandals went down, did they lose more money in the scandals than they gained from their antisocial behavior? I haven't tracked all of these cases to their end, but at least with Wells Fargo, they did not.

        More importantly, did the individuals who made the antisocial decisions, who are protected by limited liability, lose money? I doubt it. Maybe they're not "still around", but they're happily retired in mansions and there are plenty of newcomers willing to do the same thing to get the same reward.

        It's not 100% of the time: there are cases when bad behavior actually results in a net loss. But I'd say these are anomalies and not the norm. And even if they weren't, antisocial behaviors are profitable often enough that some percentage of them have a positive expected value when looked at probabilistically.

        • KirinDave 2257 days ago
          > With Wells Fargo, Citi, Experian, Bank of America, etc., when their scandals went down, did they lose more money in the scandals than they gained from their antisocial behavior?

          Wells: Yes, almost certainly. Citi: Good question. Everyone I've talked to from Citi seems to think it was a disaster that hurt the business's bottom line. BofA: Not sure which BofA problem we're talking about here. Experian: They walked away. They're a great example of how the government SHOULD have come in, made it more expensive, and then given that money back to the people they hurt.

          On Experian, I do know that their scandal over selling fake credit scores direct to consumers hurt that business badly.

          > More importantly, did the individuals who made the antisocial decisions, who are protected by limited liability, lose money? I doubt it.

          In some cases yes, in some no.

          > But I'd say these are anomalies and not the norm.

          We could make it the norm :)

          • kerkeslager 2257 days ago
            Keep in mind the chain of comments you're responding to started with:

            > [I]f we forced some liability on companies for what the software they sell does, that would change a lot of things fairly quickly.

            In that context, it sounds like you're saying we don't need liability regulation, we need data science to help consumers make better decisions, so that the economic downsides to antisocial action are higher.

            This is the repeated lie of laissez faire economics: that if we can just get consumers to become savvy they will stop giving money to bad actors and the invisible hand of the market will enforce ethics without having to resort to regulation.

            This has never worked. At best, regulation finally steps in after the companies have trashed people's lives and fines the company, and the people who made the decisions are forced to retire with their millions. At worst, the companies lobby successfully and their sociopathic business practices become not only the norm but the standard. Laissez faire economics is typically only espoused by businesses when they don't have the regulators in their pocket.

            Forgive my cynicism, but I don't think data science is the missing piece that makes laissez faire economics work. It's simply not realistic to believe that the average consumer will become knowledgeable enough to make ethical decisions on what they consume. Most people aren't savvy enough, don't care, or don't have the time. I like to think I understand most issues once I have the data, and I care, but I simply can't keep up with all the different companies and their misdeeds. The invisible hand of the market simply can't keep up with this problem.

            • KirinDave 2256 days ago
              > In that context, it sounds like you're saying we don't need liability regulation, we need data science to help consumers make better decisions, so that the economic downsides to antisocial action are higher.

              I'm not sure why you make it sound like an either/or decision. We're already seeing that improved computing power and statistical methods, coupled with their falling costs, are giving us transparency that can guide both consumers and regulators.

              It's one of the reasons I'm a fan of "legible" societies, once the asymmetry of power and information is overcome. We can all hold each other accountable.

              If you check my comment history you'll see I'm a big fan of the CFPB and generally want the government to make bad behavior more expensive, so I appreciate the rest of your post, but you're preaching to the choir. I'm furious at what the Executive has done in putting a scam artist at the wheel to dismantle it.

              • kerkeslager 2255 days ago
                My apologies for a confrontational tone then. Your previous post came across to me as being yet another defense of the idea that regulation is bad and incentivized corporations will solve all society's ills. That's possibly more a reflection of my own sensitivities than your communication. :)

                I will say that a ton of the changes which should be made are low-hanging fruit that don't need data science to justify. We don't need data science to prove that bailing out corporations when predatory lending goes wrong, or fining corporations a fraction of the profits they made from money laundering, is ineffective enforcement.

                The underlying problem is that international corporations are, to some extent, above the law, and that is a much harder problem to address.

      • whatshisface 2257 days ago
        "Lots of accounts leaving after public relations disaster," is not really something that you need data science to detect. A chart, maybe.
        • TallGuyShort 2257 days ago
          Which is about what a lot of people claiming to be data scientists can do.
      • cheeze 2257 days ago
        > blockchain

        sigh

        • KirinDave 2257 days ago
          Plenty of people are doing legitimate things in this space. One need only look. Legitimate NON-currency (token or integrity) applications abound.

          In a very real sense, github was an early blockchain startup. Do you type "sigh" on an internet forum when people talk about github?

          • theoh 2257 days ago
            The "hub" aspect of GitHub takes a decentralised protocol and adds a trusted third party (github). The whole point of blockchain is to avoid creating trusted third parties, something people like Nick Szabo identified a long time ago as a problem for protocol design.
            • KirinDave 2257 days ago
              > The whole point of blockchain is to avoid creating trusted third parties

              Nonsense. The "point" of "blockchain" is to commit blocks to Merkle trees with an agreed-upon protocol. Your value judgements are just as unwelcome as the previous post's.

              You're also wrong about github, as it's only a point of centralization for specific UI services. It need not be canonical; it's simply privileged.

              Please stop policing the direction blockchain conversations go with religious anecdotes.
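              To make the git comparison concrete: a git history is essentially a hash-linked, content-addressed record, which is the same integrity idea a chain of Merkle roots gives you. Below is a minimal sketch in plain Python (illustrative only; the commit/verify helpers are hypothetical and not any real git or blockchain API), using a simple hash chain rather than a full Merkle tree, to show how the head hash commits to the entire history and how tampering with any stored ancestor becomes detectable.

                  import hashlib, json

                  def commit(content, parent_hash=None):
                      # Content-address the commit: its id is the hash of its serialized body,
                      # which includes the parent's id (the "link" in the chain).
                      obj = {"content": content, "parent": parent_hash}
                      h = hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()
                      return h, obj

                  # Build a tiny three-commit history, keyed by hash.
                  store = {}
                  h1, c1 = commit("initial import")
                  h2, c2 = commit("fix bug", parent_hash=h1)
                  h3, c3 = commit("add feature", parent_hash=h2)
                  store.update({h1: c1, h2: c2, h3: c3})

                  def verify(head_hash, store):
                      # Walk parents from the head, recomputing each hash; any edited
                      # ancestor no longer matches the id it was stored under.
                      h = head_hash
                      while h is not None:
                          obj = store[h]
                          recomputed = hashlib.sha256(
                              json.dumps(obj, sort_keys=True).encode()).hexdigest()
                          if recomputed != h:
                              return False
                          h = obj["parent"]
                      return True

                  print(verify(h3, store))  # True; edit any stored commit and this prints False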

            • kbenson 2257 days ago
              But all this is irrelevant to his original point, and how blockchain was referenced within it. People are reacting to the word and their preconceptions and ignoring the point, which is a real shame because those same preconceptions actually strengthen the point being made.
              • theoh 2257 days ago
                There's no need to police the direction the conversation takes. I was responding only to the GP, as a matter of fact.
                • kbenson 2257 days ago
                  You're right, my comment was better aimed farther up-thread than yours. I wasn't attempting to police the direction, just note that the tangent started with little explanation and possibly some misinterpretation.
                  • KirinDave 2257 days ago
                    Theoh is trying to derail because he has religious objections to the idea that github uses "blockchain technology." You're not "policing"; you're resisting someone trying to take an off-topic turn in an existing conversation. Please be proud of this; I thank you for it.
    • miketery 2257 days ago
      > The law is all that matters

      Agreed. Let's change it.

      • jackhack 2257 days ago
        Unless you are sitting on a mountain of money, forget it. Laws of any substance must be bought.

        Laws represent power structures. Power structures are buttressed with money flows. If you threaten either, it will not end well.

        Politicians know this. They profit from it; it is the product being sold. (Ever wonder how so many congressmen end their terms as millionaires?)

    • moomin 2257 days ago
      If the law is the only thing that matters, then it follows that more regulation will come.

      Maybe try self-regulating before putting more power in the hands of the government?

      • TeMPOraL 2257 days ago
        One would wish.

        The fact is, self-regulation against market pressures pretty much never happens - coordinating people against their perceived, short-term self-interest is hard. That's the very reason governments exist.

    • wfo 2257 days ago
      Sure this is true if we just let the market do as it wills.

      But we have many tools in our historical toolbox for dealing with this problem, most of which we have left behind as we embrace neoliberal ideology ever harder: regulation with fines; regulatory bodies (which can make judgements on unethical behavior not explicitly written into law and mete out serious punishment); a criminal justice system that could be applied to executives; a troubling but powerful system of asset forfeiture designed to seize "ill-gotten gains" (and what is money gained from unethical business practices but ill-gotten?); nationalizing or breaking up companies to prevent concentrations of power that can be abused; and organized labor and collective bargaining. If, e.g., CEOs who ran unethical enterprises were afraid of having their assets seized and of being jailed, the fairly accurate picture you paint of our current reality -- the unethical businessman who crushes anyone with an ounce of human dignity, worker and competitor alike -- would look a little different.

      I think it's pretty clear that voting with your dollar will never and has never worked, but some of the above tools might. Though they all of course come with their downsides.

      >I understand it, lots of people have dreamed big dreams of how great the world would be if everyone else was just like them too.

      Actually nearly everyone already is like me in this way: nearly every human being has a sense of ethics and sees a pretty clear difference between right and wrong that transcends a profit motive. It's just that it's not so easy to turn that ethical and moral sense we all share into actual power to change things, especially when power is concentrated in the hands of the unethical (for exactly the reasons you describe above). But that's the task before us, and that's why universities are making computer scientists take ethics.

    • commandlinefan 2257 days ago
      Well, lawyers can be and have been disbarred for behaving unethically - but that does depend on a few fundamentals: lawyers can't legally practice law without a license, ethics are well-defined, and there are hearings to determine claims of unethical behavior. To apply "ethics" to software, we'd have to have (at least) all three of these in place.
    • frgtpsswrdlame 2257 days ago
      Except the article mentions that they're targeting

      >the next generation of technologists and policymakers

      Sure, tech workers in a bad system are coerced into unethical actions, but policymakers hold the power to alter the system itself so that they're not.

      • hobofan 2257 days ago
        > policymakers hold the power

        The cynic in me tells me that the winning unethical companies hold the power over the policymakers.

    • evanlivingston 2257 days ago
      I don't want people like you in the world.
      • dang 2257 days ago
        Please don't respond to a bad comment with another bad comment. That just makes this place worse, and usually provokes others into worse yet, which is why the site guidelines ask you not to do it: https://news.ycombinator.com/newsguidelines.html.
  • ProAm 2257 days ago
    Such a good opening line, "The medical profession has an ethic: First, do no harm. Silicon Valley has an ethos: Build it first and ask for forgiveness later."
    • Alex3917 2257 days ago
      Yeah but both of those are just marketing postures that don’t really have anything to do with the underlying realities of their respective industries.
      • dsr_ 2257 days ago
        It's necessary to establish a policy before you can enforce it.

        You can't argue about exceptions if you don't fundamentally support the legitimacy of the policy, either.

        "First, do no harm" is a policy statement. Once you accept it, you can make arguments about tradeoffs (pain management, amputation, chemotherapy, abortion), but without having a policy statement, there's no argument being made, just a free-for-all mess.

        • Alex3917 2257 days ago
          > without having a policy statement, there's no argument being made, just a free-for-all mess.

          Given that medical treatments are the second or third leading cause of death in the U.S., how much good is the policy statement really doing?

      • ProAm 2257 days ago
        I think it speaks well to startups, VC culture and Silicon Valley as a whole, actually. It is a bit posture-y, but I feel it's accurate.
    • da02 2257 days ago
      Actually, doctors prescribe dangerous drugs all the time: https://www.psychologytoday.com/blog/wicked-deeds/201404/pre...

      But, they get away with it since most people sue the drug companies instead of the doctors.

      During primetime in the US, you can see the ads for drugs. Wait a few years and you can see lawyers advertising class-action lawsuits for those same drugs during daytime TV.

      Then, when you bring it up, someone mentions, "doctors don't harm people. They have a motto to do no harm." Actually, most doctors ignore it and do what other doctors do. Just look at statin drugs: https://www.huffingtonpost.com/jacob-teitelbaum-md/statins-c...

      Let's see doctors prescribe magnesium citrate + fish oil instead of statin drugs. "Oh, but fish oil contains mercury!" No doctor ever says, "Be careful of drugs. One gives you side-effects and you will end up on a treadmill of drugs to cover the side-effects of the others."

      Doctors go on strike, deaths decline (in Israel): https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1127364/

      And in the UK: "When doctors strike, fewer patients die": https://www.bostonglobe.com/ideas/2016/02/09/hoskins/QhjVuBH...

      It's all hard to accept, but Silicon Valley is no different. Except people scrutinize Silicon Valley and Wall Street more than they do their own doctors. Go ask your doctor about statin drugs and see the clever rationalizations they make. No different than the people here who say, "I can't program without NULL!"(https://www.destroyallsoftware.com/talks/ideology)

    • Spooky23 2257 days ago
      It’s the downside of the VC model, where the investors are insulated from the activity and are just throwing money against the wall and seeing what sticks.

      Ethical behavior is a cousin of accountability. Tech is about failing fast. When the management and investors in a firm have little accountability for the future of the individual enterprise, why would you expect them to make correct decisions vs expedient ones?

    • chasd00 2257 days ago
      "The medical profession has an ethic: First, do no harm.."

      I always liked that "do no harm" statement. Notice how it seems to imply "do only good" but it doesn't. Clever.

  • ytyutufhg 2257 days ago
    You cannot change human nature. You can only save yourself. Is that cynical? I wonder...
  • oakgrove1 2257 days ago
    Universities need more fluff courses so they can pump out more people with scam degrees. It fulfills the goals of creating more busywork trash courses and also pushes mind-control propaganda.
  • muninn_ 2257 days ago
    Instead of Harvard, Stanford, and MIT attempting to address "the dark side" I'd prefer to see people who aren't graduating and going straight to work at the companies who bring out the worst (and sometimes best) of tech's ethical problems address this problem or at least be heard. Do we really need Goog.. I mean Stanford lecturing us on ethics?

    I guess it's good that they've at least somewhat seen the problem, even if it's a problem because it harms future revenue streams. Maybe the NYT (which I have a subscription to) is going to tell me next that Clinton has a 100% chance to win the 2020 election?

  • rafiki6 2257 days ago
    Technologists aren't doctors. Doctors are a necessity because people have an incessant need to stay alive. Technologists aren't a fundamental necessity to society. We are business people. The only rules/ethics that govern us are business rules/ethics. Some might argue, "but technology is required for us to survive". It's not. We've survived plenty without it. Technology is required for us to thrive.
    • dragonwriter 2257 days ago
      > Technologists aren't doctors.

      Doctors are a subset of technologists.

      > Doctors are a necessity because people have an incessant need to stay alive.

      Medical care is an application of technology people are particularly willing to sacrifice other things to pay for, but not without limit.

      That's not the only application of technology for which that is true.

      > Technologists aren't a fundamental necessity to society.

      Hard to say that's any more true than of doctors; you can have a society without doctors or without other technologists, but if you do, people will very quickly start assuming those roles.

      > We are business people.

      Most technologists are not business people. A few are, but that's incidental, not fundamental.

      > The only rules/ethics that govern us are business rules/ethics.

      At least in formal terms, this is true of information technology, but it has nothing to do with necessity or lack thereof (it's not true of lawyers, who are certainly not necessary to survival.) It's just that IT (unlike medicine, law, or proper engineering) lacks professionalization.

      • hueving 2257 days ago
        Using technology != technologist in this context. Calling a doctor a technologist essentially makes everyone a technologist and the term becomes meaningless.
        • dragonwriter 2257 days ago
          You are welcome to pose a definition of technologist that includes the people you mean to include and excludes the people you mean to exclude, but by any coherent definition doctors are as much technologists as IT workers in general, though I can see some reasonable definitions that would exclude most of both and include only a few of either.

          But, in any case, the lack of formal ethical rules applicable to IT is not about how necessary technologists are or aren't compared to doctors.

    • giaour 2257 days ago
      This sentiment strikes me as naive. Doctors aren't the only people governed by a code of professional ethics: lawyers, civil engineers, and public school teachers can all have their licenses revoked for ethical violations.

      Setting the bar at "a practitioner's malpractice could result in death or injury" ignores all other forms of harm.

    • TallGuyShort 2257 days ago
      My university had us take the same ethics class as civil engineers, etc., and there was plenty of case study material where lax standards in engineering (including computer systems) had killed people. We can live without technology, but it's a lot harder to live when the optional technology that a lot of people choose to use starts killing people. It's important for engineers to understand the role they can and should play in advocating for safety and high standards.

      It needn't be a focus of the education, but it should be a concern of universities to ensure students graduate with an understanding of the importance of integrity for society's good and how they can ensure their projects are ethical.

    • criddell 2257 days ago
      Technologists should study ethics and abide by a code not because technology is required, but because it's pervasive and often hidden.
    • wu-ikkyu 2257 days ago
      Without technology doctors/medicine would not exist (apart from perhaps foraging for medicinal plants)
      • romanovcode 2257 days ago
        The technology being discussed here is not the technology you are referring to.
    • Apocryphon 2257 days ago
      To be specific, we've survived without monetized, big money-driven technology.
    • matt4077 2257 days ago
      This is obviously wrong on so many levels, but just to point out the most glaring flaw: How is ethics only relevant when people's survival is at stake?
    • purple-again 2257 days ago
      Well said. Those of us who came from a business background remember our business ethics courses in university well. The running joke, from start to finish, was that the course was a thank-you for compiling the list of things we should walk carefully, and carry a big stick, while doing.
      • grasshopperpurp 2257 days ago
        >The running joke, from start to finish, was that the course was a thank-you for compiling the list of things we should walk carefully, and carry a big stick, while doing.

        These are the types we should weed out of society.