Media Law Resource Center

Serving the Media Law Community Since 1980


Legal Frontiers in Digital Media 2018


The eleventh annual conference on emerging legal issues surrounding digital publishing and content distribution

New Venue in 2018!

Mission Bay Conference Center
San Francisco, CA
May 17th & 18th, 2018

A Joint Conference of

  • Media Law Resource Center
  • The Berkeley Center for Law & Technology

The conference explores emerging legal issues surrounding digital content in today's multi-platform world. It will feature a keynote and six sessions, running from 1:00 p.m. on Thursday, May 17, through 12:30 p.m. on Friday, May 18, with an evening reception on Thursday.





This year's conference will include:

  • Keynote by Kara Swisher
    Kara Swisher, influential technology journalist and co-founder of Recode, will give a keynote speech on the current social and political climate for digital companies. She will tackle a theme that runs throughout this year's sessions: a shift in the attitudes of the public and public officials, who increasingly want platforms to take on more responsibility and act as filters policing objectionable content, propaganda, and illegal activity. Are digital platforms' responses meeting the challenge?
  • Face-Swapping Technology: Dignity, Privacy & the First Amendment
    New machine-learning technology allows even amateur video editors to conjure videos that convincingly replace people's faces with those of others, frequently unwitting celebrities, to both creative and destructive ends. Digital face-swapping has been used for satirical internet videos and, perhaps most famously, to recreate a younger Princess Leia as seen in the Star Wars film Rogue One. In their most provocative form, these so-called "deepfake" tools have been used to create X-rated content featuring the faces of popular Hollywood actresses grafted onto porn stars' bodies. The videos have already engendered a firestorm that has led to bans even on freewheeling platforms like Reddit and Pornhub. This short presentation will explore whether the law can keep up with this controversial form of speech, and whether a balance can be struck that protects the reputational and privacy interests of unwitting subjects while upholding First Amendment principles.
  • Under Pressure: Hosting and Unhosting Objectionable Content
    Platforms face increasing pressure on a number of fronts to take down, moderate, or stop hosting objectionable groups and content, such as material originating from white supremacists, alleged sex traffickers, and terrorist groups. The pressure comes from political forces seeking legal reforms, such as the recently passed Section 230 exception for sex trafficking (FOSTA) and EU regulations demanding accelerated removals, as well as from social and public-relations pressures, e.g., public outrage over neo-Nazi groups online after the violence in Charlottesville. As platforms shift to a more hands-on approach to editorial control, how should they refine their own values and community standards to balance a safe online environment with free speech?
  • Combatting Internet Disinformation Campaigns
    Beginning with a tech tutorial on how fake news is created and distributed in artificially viral ways, this session will cover how bots and fake accounts are used to manipulate people, and how advertising tools are used to target particular users. Whether run by foreign governments like Russia or by fraudsters and other individuals wishing to influence opinion and action online for their own ends, disinformation campaigns have become an acute problem that social media sites face mounting calls to address. Virality online can be good or bad, but how do we distinguish the good from something that malevolently manipulates and undermines democracy? And how do we respond to bad virality?
  • Women in Tech: Is Climate Change Coming?
    It has been approximately a year since the Uber scandal uncovered a culture of sexual harassment in the tech community. While the #MeToo movement has made clear that Silicon Valley is not alone, the tech community also faces a dearth of female founders and executives, which may be contributing to the climate of sexual harassment. Tech lawyers are not immune from harassment and discrimination, but have they also contributed to the problem by negotiating NDAs to silence victims? At the same time, is there a danger of overreaction to allegations that fails to allow the legal process to run its course? This session will examine the current climate faced by women in tech and will discuss how the law, and tech lawyers, may fit into this puzzle and help shape the future of women in tech.
  • How Algorithms & Machine Learning Work
    This session will begin with a tutorial on how algorithms and machine learning work, giving lawyers a better understanding of how the technology applies to solving real-world problems. For example: how does machine learning help a review site spot fake reviews, a social media platform identify misinformation campaigns, or a site identify a banned user trying to rejoin under a new identity? The tutorial will explore the limits of what algorithms and machine learning can and cannot do. The demonstration will be followed by a broader policy discussion of some of the practical, legal, and ethical challenges of using algorithms.
  • Scraping by with the Computer Fraud & Abuse Act
    The Computer Fraud & Abuse Act was enacted by Congress in 1986, primarily as a tool to criminally prosecute hackers, in an era before the birth of the web and online publishing, when the internet was used mostly by a small universe of academics, government, and military staff. Although Congress has updated the CFAA several times, its meaning in the modern age of universal internet access and porous digital borders has eluded courts: what does it mean to access a computer without authorization? This panel will attempt to make sense of the various, often contradictory, judicial rulings in this area, and debate a better way forward that balances platforms' private property rights in their data with the public's right of access to online information.

UC Berkeley School of Law certifies that this activity has been approved by the State Bar of California for 7.5 hours of Continuing Legal Education credit (6.25 General Hours, and 1.25 Hours in Recognition and Elimination of Bias). If you are seeking credit for another jurisdiction, please check with your state bar to determine if California CLE credits are recognized, through reciprocity, in your jurisdiction.


Questions? Contact us by e-mail via the conference website.
