This is common at many semiconductor companies. At one semiconductor OEM I know of, it isn't just a policy ban: the USB ports are disabled, and the ports on newly issued computers are epoxied shut to prevent anyone from even trying to use them.
Semiconductor technology is one of the areas of global technological competition that surely benefits from secrecy. For example, several years ago a C-level executive at this specific OEM was caught taking pictures inside one of their customers' facilities. Only the engineering staff from that OEM is allowed inside those facilities now, and only for maintenance activity.
It's a very secretive industry, and it's getting even tighter now that China has been blocked from acquiring the latest semiconductor technology. China has so far invested over $40 billion in boosting its chip-making abilities, aiming to acquire capital, IP, and knowledge workers. Given its history of borrowing technology from other countries without attribution, if I were the head of a semiconductor-related company, I would clamp down on secrets ahead of the danger.
It's always been an arms race and nations and companies have always strategically opened themselves up to acquire knowledge.
When Germany was trying to catch up to the industrialised UK, it sent businesspeople to the UK to copy the layout of factory floors and reimplement them at home. For the longest time, Japan only allowed trade under restrictive conditions, to maximise exposure to foreign technologies while minimising the influence of foreigners in Japan. It's a story as old as history, and I honestly don't get why people are upset about it.
It's even beneficial for the ecosystem overall, because it helps transmit knowledge and technology from advanced to disadvantaged regions, which in turn accelerates production and development.
If you honestly don't know why, I'll explain it to you: it is because it is theft. If I were to steal some or all of your possessions and give them to the disadvantaged, then I suppose you shouldn't honestly get all that upset about it either, but I doubt that's how you would react.
> In economic terms, intellectual property is non-rival, whereas tangible property is rival. As a result, the “piracy” of intellectual property is simply not the same sort of zero-sum game that car theft — or theft of any tangible property — is. And that means that when Hollywood or the U.S. government says that music or movie downloaders are “pirates” or “thieves,” they are indulging in a bit of loose rhetoric. There are, in general, good moral reasons not to take what doesn’t belong to you. But as this video by filmmaker Nina Paley so beautifully illustrates, copying is not theft.
This is a popular argument but I think it is wrong because it uses the wrong analogy. I'll illustrate.
If I buy a candy bar at the store for $0.99 and then you steal it, now you have it and I don't. I'm out $0.99 of value and you stole $0.99 of value.
But what if you steal the candy bar directly from the store? You got $0.99 of value, but the store didn't pay $0.99 for that candy bar; they may have only paid $0.40. What you actually stole from them was two things: $0.40 of inventory and $0.59 of foregone revenue.
Even if you drop $0.40 on the counter as you walk out the door, it would still be considered a theft because the store gets to decide what their retail prices are, not you. You're still stealing $0.59 of foregone revenue.
IP products essentially have an "inventory" cost near zero. Yet, if you take a song or a trade secret without paying for it, you're still taking their foregone revenue. Just like dropping $0.40 on the counter for a $0.99 candy bar, you're dropping $0.00 on the counter for a $0.99 song or $1 million trade secret. Reimbursing someone's inventory cost (even if it's $0.00) does not mean there is no theft occurring.
While the IP itself is non-rival, the revenue from each sale is rival; that is, if you acquire my IP via a pirated copy then I still have my IP, but I don't have the revenue from selling you my IP.
This analogy is also wrong, exactly for the reason that IP is not a physical thing that can be taken - it can only be copied.
Yes, by leaving $0.40, I would be stealing $0.59 of foregone revenue, but that is because the shop doesn't have that candy bar anymore and can't sell it. In the case of IP, even if I pirate it, you can still sell it to other people.
Of course you probably can't sell it to me anymore, and that may be lost revenue, but that depends on whether I would actually have been willing and able to buy that IP otherwise. And of course it may cause more competition to appear, which will negatively affect your business.
Your candy bar argument is absurd because you can't sell IP directly to consumers. You could manufacture a product that uses it, or sell access via some supply-limiting portal, but that is much different from actually selling the IP.
Additionally, it's not inventory, because it is not finite. GAAP makes a distinction in the valuation of tangible vs. intangible assets because valuing intangible assets such as IP is a much more complex process.
Physical inventory and IP are analogous in that they are both types of property that their owner intends to use to generate revenue (and hopefully profit).
> it is a much more complex process to valuate intangible assets such as IP.
Please note that my argument above assigns a value of $0.00 to the IP itself.
Even at that valuation, you can still commit theft by robbing me of the opportunity to use that IP to generate revenue. By analogy: when you steal a TV, you are charged on the retail price of the TV (the amount you should have paid for it), not the wholesale price (the amount the store paid for it).
Your intention is irrelevant; limiting your opportunity is not theft. You're using that analogy as an emotional appeal to present the IP holder as a victim.
Any competitor could limit your opportunity by releasing a functionally similar, non-infringing product. Would you choose the word 'robbing' in that circumstance?
One would have to use your IP to create a competing product before it would even be copyright infringement, and it still wouldn't be theft, since you cannot steal something intangible: by definition it only exists as an abstraction, which is not the same as zero-valued tangible inventory.
I think you need to go back up and carefully read from my first post. I'm not talking about competitive markets in general, I'm talking about a specific transaction where you end up with a product I'm selling, but I don't end up with the revenue I was asking for that product.
As an exercise: walk into Best Buy and pick up a $1,000 TV. On the way out, hand the cashier a check for the wholesale price of that TV. Have you just committed theft? And if so, what specifically have you stolen? You haven't deprived them of the TV itself, since you reimbursed them fully for that.
Allow me to present an intangible analogy to illustrate: if I see a chair at IKEA on which they own a patent, and I go home and make a chair based on their IP but don't commercially manufacture and distribute it, did I steal a chair? If it is stolen, what have I stolen exactly?
I guess my point is that IP is a license to manufacture and/or distribute, rather than a total monopoly on the idea or the demand for the product.
Industrial *espionage* does happen (when the IP is a trade secret). If one doesn't want to risk espionage, they can patent something, disclosing every detail, in exchange for state-enforced exclusivity over the patent.
As far as taking externalities into account: if someone doesn't behave in a carbon-neutral way, are they stealing from everyone?
Furthermore, the position you are defending presupposes the ontological position that IP is property (which might not hold in other cultures/countries, especially a Socialist Country with Chinese Characteristics™).
Patent infringement isn't theft, and winging it and saying otherwise without a good basis won't convince anyone.
Well if it's theft then it's been going on and used by everyone for thousands of years. To be upset about this is to be upset about gossip. It's an informal means to disseminate information. It will naturally occur in any sufficiently competitive system.
And the key point may be that the ones being 'stolen from' even allow it, because the benefit they get from opening up to new markets usually exceeds the risk of having some of their IP copied.
It's not like the traders who ventured to Japan turned around just because they knew the Japanese would try to catch up by any means, or that the British stopped trading with Germany (or that Apple has stopped selling or building things in China).
Taken to its logical full extent, you might as well build a SCIF: the intelligence-agency standard for a facility that can handle air-gapped Secret and Top Secret data. If you look at the office park directly west of Fort Meade, MD, some of those buildings have SCIFs inside them. Not cheap.
I think it depends on the tier at which the company does its business. Is the OEM involved in the latest nodes, or are they working with sizes >100nm?
Secrets are kept tight at any company with tech relating to the latest sub-10nm nodes. It just so happens that IBM has been working on sub-10nm processes for a long time, so they have a lot of secrets to keep; hence I'm not surprised by the article's headline.
> you could bypass this if you had admin rights but it would leave a trail
A move I've seen put in place at several locations is removing local admin rights from all users. Those with advanced needs, like developers, get a VM which is limited to a specific VLAN, with no access to the production environments.
The principle is sound, implementation is ... difficult, to say the least.
And if you're willing to run a lot of screen captures, or re-type the stuff you see onto another computer, you can still get the data out. Before modems were common in the hands of the unwashed masses, my friend and I would transfer files over the phone by spelling out blocks in hex. Slow, but with a checksum every 16 bytes it was good enough to get some work done. If the data is high-value enough, it would probably be worth it.
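That scheme can be sketched in a few lines; the additive one-byte checksum here is an assumption, since the exact checksum the poster used isn't specified:

```python
# Toy sketch of transferring a file as spoken hex blocks, with a simple
# additive checksum after every 16 bytes. The checksum algorithm is an
# assumption; the original scheme's details aren't specified.

def encode_blocks(data: bytes, block_size: int = 16) -> list[str]:
    """Split data into hex blocks, each followed by a one-byte checksum."""
    blocks = []
    for i in range(0, len(data), block_size):
        chunk = data[i:i + block_size]
        checksum = sum(chunk) & 0xFF  # additive checksum, mod 256
        blocks.append(chunk.hex().upper() + f" {checksum:02X}")
    return blocks

def decode_blocks(blocks: list[str]) -> bytes:
    """Reassemble the data, verifying each block as you would by voice."""
    out = bytearray()
    for block in blocks:
        hex_part, checksum_part = block.rsplit(" ", 1)
        chunk = bytes.fromhex(hex_part)
        if sum(chunk) & 0xFF != int(checksum_part, 16):
            raise ValueError(f"checksum mismatch in block: {block}")
        out.extend(chunk)
    return bytes(out)
```

A mis-heard hex digit fails the block's checksum, so only that one block has to be re-read aloud.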
Airgapped exfil is a whole research field. Priority one is making accidental leaks or infections all but impossible. Priority two is making large scale intentional leaks as slow and difficult as possible. As long as there are human eyes on the data, some level of leaking is possible.
Any host-based DLP will monitor and block screen captures, and you'd still need to get them out somehow.
Printing will also be heavily restricted and monitored.
Sure, no DLP solution would beat pen and paper, but that's not a good way to exfiltrate data these days, and a security guard checking people leaving a restricted area would be a good enough way to plug any leaks.
All of these technologies are great for preventing employees from casually stealing data, but as history has proven, while the front is usually well guarded, the back end often has quite a few holes for hackers to exploit. It's like they say: the most secure computer is the one not connected to the network.
People like to dramatize things way too much. Most "breaches" are accidental, and there are thousands of companies that manage to keep their data secure on a daily basis despite targeted attacks.
With a few exceptions, it's much easier to either poach your competitors' people or simply buy the company outright, which is why "corporate espionage" is mainly employed these days by nation states that need to catch up.
It's also important to note that stealing a design isn't that useful these days, since by the time you have it you're too far behind. What would be more important, especially in the semi industry, is knowing the characteristics of a specific design or process, in order to preemptively position your offerings to compete without giving up any unnecessary ground.
Well, bias towards "infrastructure as code" and minimise the need to actually access the servers. Read logs through Splunk, configure through version controlled Puppet (or similar tools).
The only machines that can actually SSH into prod at least have screen-session-recording software, perhaps are kept in a separate room, with a policy of two staff present at all times. The general idea is that the closer you actually get to being able to bypass the checks and controls, the more attention you bring to yourself and your errand.
Yes, it can't be Git and Puppet all the way down, at some point someone will necessarily have access to do something as root on the server that hosts the Git repo that Puppet runs from. But instead of that being every dev on every laptop anywhere in the world, you can make sure it's a very small group of people, from a small number of workstations.
This is difficult and requires a substantial and very competent team to implement correctly.
To be fair, they are not completely locked down like I may have made it sound.
The USB port restriction is absolutely true for laptops, but colo desktops probably don't have this restriction, or have a more lax one. Also, it's not true for all of the company's laptops: USB access requires explicit permissions and a newly issued laptop, so they have a list of people who carry riskier laptops and may need a different travel laptop vs. engineering laptop.
The epoxied ports likely prevent distracted travelling workers from having their laptops compromised by USB keys inserted when they aren't looking.
Another common approach is thin clients that connect to VM's running on servers in protected rooms. My company does that. From there, most critical apps are web apps they monitor. Specific things that need Windows or Linux run in the VM's. There's even some solutions that support connecting USB devices to thin clients for use on remote VM's with some security on that process. I have no idea if that security is good but there's potential for secure devices designed for these use cases.
While visiting the Chinese division of an American company I worked for about 8 years ago, I saw that they had Dell desktops in a tower configuration, with a metal (steel) superstructure around them. A case within a case.
You could unlock it, plug in USB, monitors, Ethernet, etc., route the cables out through a slot, and then lock it. You couldn't reach the different ports and sockets when it was locked. It was all relatively stout, too.
Interestingly, they did have internet in the building, but you had to have management permission and go to the security office to access it. If you wanted to download something, it was a fairly involved procedure. They were paranoid about people stealing their stuff, and also very paranoid about misappropriating something they shouldn't and creating legal problems in Europe or North America. They didn't seem all that concerned about stealing other people's IP, but they wanted to control everything so that it didn't contaminate anything they wanted to sell where it mattered.
I nearly created a situation when I pulled out my phone to take a picture of an "engrish" sign they had on the wall.
It'll be interesting to see how the security culture in the US changes, or doesn't. A lot of younger developers would be put off if you attempted to restrict their access to the internet from work computers, or restricted their ability to plug in a phone or some arbitrary device for charging or whatever. At every job I've had this century, it has been fairly trivial to download nearly all of their source code and take it home if you wanted to. Trying to take the politics out of it: we're very much engaged in a cyber cold war with a number of nations that absolutely want to take our secrets. I won't speculate on the real impact, but the Russians did meddle in our election.
> Are cases locked / glued shut? It seems like it would be easy enough to connect a new usb header...
Well, sure. I think the idea is to prevent mistakes: people just randomly plugging in USB sticks they got as gifts at conferences, or when they work from home. People will forget and do that anyway, even if it's against the rules.
When someone starts disassembling the laptop, or runs around with a screwdriver trying to get the epoxy out of the ports, it would be a bit hard to claim it was an accidental mistake. In that case, those people can probably find a way to exfiltrate data or infect the system without the aid of USB keys.
Epoxy is a pain, just use a sticky connector envelope instead.
Disabling the whole USB host on disconnect is trivial to code. You would disable on disconnect rather than whitelist devices, since there can be USB host bugs.
Add a boot password so that only an admin can start the machine...
Plus, there are solutions where the drivers are also put in a VM to further reduce the attack surface.
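The disable-on-disconnect policy can be sketched as pure logic; the actual kill switch is platform-specific (e.g. on Linux one might write "0" to the host controller's `authorized_default` sysfs attribute), so it's stubbed out here as a callback:

```python
# Minimal sketch of a disable-on-disconnect policy: rather than whitelisting
# devices, the first disconnect event latches the whole USB host off.
# The kill switch itself is platform-specific and passed in as a callable.

class UsbHostGuard:
    def __init__(self, kill_switch):
        self.kill_switch = kill_switch  # callable that disables the host
        self.disabled = False

    def handle_event(self, event: str) -> None:
        """Process a USB event; latch the host off on the first disconnect."""
        if self.disabled:
            return
        if event == "disconnect":
            self.disabled = True
            self.kill_switch()
```

Once a stick is pulled, no further device is accepted until an admin re-enables the host, which sidesteps the need to trust device whitelisting against host-stack bugs.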
I live and work in China and make far more than that a year. Also, the NZ/Australian property markets are both in a bubble caused by cash investment from wealthy Chinese individuals; 100k CNY won't buy anything.
They could glue in a USB data blocker, which is a thing that plugs into a USB port and has a USB socket on the other end and has the power lines connected between the port and socket but not the data lines.
This is going to hurt everyone in there except the really motivated attackers.
If I had to smuggle some data out of a PC, I'd use the audio card at very low volume but known frequencies to output the data FSK-style, leaving a phone near the speaker recording the audio; then at home I'd reconstruct the data just as old modems did 40 years ago.
Other methods could also be used: blinking a pixel (or the optical mouse LED, left pointed at the phone camera), raising CPU power usage by running busy loops according to the data's ones and zeroes (nobody would object to a Bluetooth power-consumption measurement device connected to the PC's mains cord), or manipulating innocuous fields in IP packets such as the Time To Live: make it higher than X and it's a one, make it lower and it's a zero. Routers will ignore the trivial difference, but a passive (100% undetectable) tap on the network cable still allows sniffing the data. Actually, unless the infrastructure is configured to report Ethernet frames sent to wrong addresses, one could send malformed frames carrying data in the payload, so that they'd be ignored at the switch level but still readable by a tap on the cable; that would allow data smuggling at blazing speeds, much higher than USB.
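The TTL trick can be sketched as a toy encoder/decoder. The threshold and the two TTL values are arbitrary assumptions, and a real receiver would have to allow slack for routers decrementing the TTL in transit, which this sketch ignores:

```python
# Toy model of the Time-To-Live covert channel described above: each
# packet's TTL carries one bit (above the threshold = 1, below = 0).
# Threshold and TTL values are arbitrary assumptions for illustration.

THRESHOLD = 64

def encode_bits(bits: str) -> list[int]:
    """Map each bit of the message to a TTL value."""
    return [100 if b == "1" else 40 for b in bits]

def decode_ttls(ttls: list[int]) -> str:
    """Recover the bit string from the TTL values seen on the wire."""
    return "".join("1" if ttl > THRESHOLD else "0" for ttl in ttls)
```

At one bit per packet this channel is very slow, which is exactly why the poster notes the malformed-frame variant would be so much faster.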
Within a business, security has to be understood in the context of risk analysis.
Will the measures taken damage the company more than the benefit of the increased security?
In my eyes, banning storage media without a practical replacement is on the wrong side of the risk analysis equation. Sure, it's not desirable that files can be moved without full access control and auditing, but if people don't have a tool that works the same way then all your employees who rely on it will suffer.
It would have been great to see a standard encrypted portable drive, probably with the decryption software on a separate partition, establish a foothold in the world. With the addition of DLP software, that could have allowed employees to continue doing their work while mitigating the risk of lost or stolen data.
edit: what this doesn't cover is transferring large files quickly, or transferring files to a computer that isn't currently configured with the systems required for file transfers.
With the rise of VPNs, fast home internet, and fairly ubiquitous public wifi, along with laptops gaining enough power for most applications, the question is: what real use case requires removable media and can't be accomplished with an internal FTP server/network drive/email and company laptops?
If I'm giving a presentation to hundreds or thousands of people in 30 minutes and a laptop goes down, won't connect to the projector, the presentation wasn't loaded or was corrupted on the speaker laptop, or things go sideways in a bunch of other ways (and they do), I don't want to be depending on the network to pull down a presentation from somewhere. Especially if I don't have my own laptop with me, as is at least sometimes the case.
There should be various other ways to get to a presentation and I even have a setup to present from my iPhone (which I've actually had to use as a fallback), but a USB stick is a good, simple backup that I've used more than once.
Sometimes security is annoying, and sometimes security means you don't get to do what you needed to do. An event with hundreds or thousands of people should have rehearsals and spare speaker laptops, but if they don't, your need to present doesn't trump your (or someone else's) employer's cyber security stance.
Show up at a building site without your helmet and hi-viz? Unless there's a spare set around, you're going home. We need to get into the same mindset around cyber security.
It's going to depend on the circumstances and, perhaps, we really will decide over time that moving USB keys among untrusted laptops is simply a bridge too far. But, literally just this afternoon, I used a USB stick to transfer a presentation to a (supposedly clean) speaker laptop that hadn't been preloaded like it was supposed to be. The circumstances (a ~100-person event, no rehearsals or heavyweight IT support) made that seem reasonable.
Cybersecurity is indeed important. But it's about managing risk, not minimizing it no matter the cost. And what I'd do at an event like this one is quite different from Defcon where I'd take the most stringent precautions possible.
I just bought a new SanDisk USB 3 drive and noticed it comes with SecureAccess software, which uses 128-bit AES encryption. Better than nothing. You'd think IBM, a large and successful long-term tech company, could make its own. But that doesn't support the cloud vision.
A hybrid structure could work: allow the transfer of files via portable drives, but only allow them to be saved or read in encrypted form on approved devices, with the encryption keys managed on the devices or through the network (Group Policy, etc.).
I work at IBM and I can get almost all of my development work done without logging into the company network when working remotely - quite similar to Google's approach. This is a relatively new development within the past few months.
Last meeting I was in with an IBM rep, he wanted to download some slides that were not part of his stock presentation, to address some specific questions we had. By the time he got connected to his VPN about 10 minutes had elapsed, and the download was going to take another 90 minutes. He ended up saying he'd email them later.
Corporate VPNs are great, but if they don't perform, people are going to work around them.
Remote work and working remotely are two very different things. The former usually refers to making your own arrangements for normal working hours outside of commuting to a corporate office; the latter usually refers to off-hours or one-off arrangements.
IBM may not want people remote working, but they'll be happy to have people working remotely at nights, on weekends, etc.
First, if USB sticks are out for data exfil reasons, 4G sticks (and anything else that isn't a HID device) are out as well.
Second, such networks generally have an approved way of getting data onto them, which typically involves handing your USB stick or DVD to someone with access to a particular machine with unblocked ports and specially firewalled access to a particular shared drive on the network, and nothing else.
It's one thing to ban stuff and another to actually provide practical solutions so that people can continue working. Banning stuff looks good on paper, but sometimes the clever workarounds people invent are much worse than the original thing.
A practical example from the past: policy denies creating user accounts for externals, but in big orgs you sometimes need an external to do work on your systems anyway. The idealistic view is that an employee babysits the external and performs everything on their behalf. In reality, people have their own work and deadlines, so it becomes tempting to just log in and let the external work under your credentials.
Agreed - this doesn't seem like that big a step at all from a security perspective. The main change is that it will now force a bunch of folks to think about how they manage their data and their consulting teams especially will get annoyed with how "hard" it is to transfer that powerpoint deck for their presentation.
Leaving data lying around on a stick is one vector. Malware and other considerations are quite another that should be discussed more often, especially if your security posture is jelly-filled (hard on the outside, soft on the inside).
I'm guessing they got burned here at least once with a relatively large dataset that somebody left in a cafe somewhere.
Seems like the move addresses accidental data leaks through portable device misplacement. I don't think it can do much to protect against intentional leaking. I mean, they want people to be able to work from anywhere for the sake of business, so I guess this addresses one vector. Hopefully, they understand this.
It's not stuff getting out that's the problem. It's stuff getting in. This move addresses the near-certainty that if a malicious actor drops a flash drive in the parking lot, it will be plugged in.
When you're a large global company like IBM you become the target of lot of groups.
I remember when they came to our school to show off the new hot-desk software they'd written. Look how efficiently we can use space! I saw it as saying: you're so unimportant we won't even give you a dedicated desk.
Now apparently you can't even use common tools to get the job done. If I had a big meeting I wouldn't take a chance on the network to keep my presentation.
Hot desking mostly makes sense only if you're (1) short on space and (2) have a fair number of people who only come in intermittently. Otherwise, you're not really saving anything and you make it harder to find people who aren't sitting in a fixed location.
Presentations would be the big inconvenience for me. I always carry a backup on a USB stick when I'm presenting at an event or conference. Just today, I loaded a presentation onto a conference laptop that was supposed to have been preloaded but wasn't. I could probably have worked around by getting the presentation off the network but I never like to depend on WiFi at an event.
ADDED: Especially for an external public presentation, I'd get it onto a USB stick one way or the other.
>Now apparently you can't even use common tools to get the job done. If I had a big meeting I wouldn't take a chance on the network to keep my presentation.
Then you are an active threat to your network's security. You sound like you have the incentive and willingness to put in effort and take personal risks to circumvent the security protocol at your workplace. Among employees without bad intent, this is as bad as it gets.
This is a great example why social engineering is unlikely to go out of style anytime soon. People who think they know better and are confident in being qualified to take a risk are never in short supply.
First, using USB drives was never against the security policy at my work, so your personal attacks are unjustified. And secondly, my PowerPoint presentation isn't exactly top-secret classified material.
If we were talking about handling the private signing keys I would agree with you. Different types of data have different levels of security needed. Over classifying trivial data just makes it harder to get things done.
I am sorry if it came across as a personal attack; I didn't mean to imply that you are personally negligent or unqualified to make that call. For all I know, it is your job to make those policies at your place of work. The problem is a user with this attitude whose job isn't to make that call.
Strictly speaking, if a user in a workplace where this behavior is against the security policy acts like this or expresses this opinion (namely, that they won't be stopped by a security policy from holding their presentation), they aren't just a possible attack vector but a possible attacker themselves.
If your company forbids the usage of USB ports, they do that for a reason, whether individual users think it's reasonable or not. This isn't just about possibly leaking data, but about introducing stuff into an environment that is supposed to be closed. Put differently: as a bad-faith person, I will gladly lend you my USB stick to transfer your important files so you don't get in trouble over technical hiccups. I will also take you up on the "thank you" snacks, and then watch silently when you get in trouble for intentionally breaking security policy and infecting the network.
The problem with blanket policies like this for an entire organization is that they don't consider the type of work being done. A publicist's job, for example, is to distribute information publicly; you aren't helping them by making it impossible to drop off a USB key to someone. A software developer needs to install operating systems much more frequently than an average user; you can net-install, but USB media is still a major install path.
The vast majority of people in an organization don't work with sensitive material. It makes much more sense to do this on a per department basis.
Policies like this that ignore on the ground reality and make it hard to get work done encourage abuse of the rules. When the rules are too draconian people will work around them and it encourages disrespect for other rules - especially the ones that actually do improve security.
That doesn't make individual users less of a security risk. This boils down to how much harm an individual user can do, most of which the individual user doesn't fully grasp. It can be as simple as introducing something into a system and enabling an inside attacker, or walking around with an audio keylogger in the form of a USB stick. Your laptop with a borrowed USB stick can record the sound of someone well above your security clearance typing in sensitive data while you're in the same room. That sensitive data might be as trivial as a cost center number.
Those rules aren't draconian; someone somewhere was hired to make a risk assessment and found them to be necessary. You don't circumvent them, period. That's the responsibility of each and every employee. No matter whether you think they are stupid, it isn't your call to make, unless someone hires you for exactly that, in which case getting them right is your responsibility.
Most attacks aren't some espionage stuff; they're plain and simple precautions against fraud and theft. And they target people who are too proud for their own good and never think of themselves as being at risk, which is sadly something a lot of computer scientists have a bit of a problem with. No one wants to believe they could be duped like that; they surely would know better. Security policies exist so we don't have to rely on individual egos, and can look at it more realistically. Most of us will be duped when taken advantage of in a bad moment, without enough time to think about it. That's why people do it.
There is a reason even the CEO and CSO have to wear their badges. There is a reason a lot of Snowden's colleagues started to get really scared once it became clear which credentials he had used to access the files. And they were lucky: he could have been a criminal, and a criminal sure as hell wouldn't have mentioned that those data leaks were his doing. He could simply have been transferring money in their names.
I would love to see the cost benefit analysis that showed a company wide ban on USB keys would save more money than it cost.
Meetings happen daily, and an iffy wifi connection is enough to waste the time of everyone in the meeting (10 minutes x 15 people is 2.5 person-hours). These mundane things happen day in and day out. Someone walking around with an audio-based keylogger is dramatically less likely (and this ban on USB drives wouldn't prevent that anyway).
If you ignore the cost of people's time, then pretty much every security idea makes sense. But it's not a good way to run a business.
I've come across this with data security policies at law firms, accountants, and investment banks.
They have to protect sensitive client data and guard against illegitimate leaks. And the universal way they communicate with clients, opposing parties, and regulators? Email.
You can't force the SEC to accept your new encrypted cloud solution, nor can you require opposing counsel to use YOUR solution. And communicating with hostile parties is the entire organization's reason for existing.
"Simple and reasonable" IT approaches to problems frequently break down when faced with people whose job it is to deal with other organizations.
I too thought of it as draconian at first, but come to think of it, USB drives seem much less important now than they did back in 2007. They're still everywhere, but they aren't thought of as the boon they once were. Maybe IBM is onto something.
Perhaps the fairly inevitable breaches of client data are more forgivable when a firm can point to efforts like this as evidence of having tried to keep a lid on leaks, and as a way to build confidence that a breach is less likely to happen.
From what I've seen, some firms' internal security practices seem to be mostly about appearances, whereas other firms tighten and scrutinize everything as if the company's reputation and survival depended on it.
Most companies that are somewhat serious about their business and the privacy of their clients (and their clients' clients) have a removable media policy. IBM's is nothing special in that respect. Besides cutting down on exfiltration vectors it also nicely takes care of some ways in which you might end up with malware on your corporate systems.
If your company does not have some kind of removable media policy then you are probably working for a very small company.
Would it be possible to just disable the USB driver from the OS (or limit it to a very few tasks) without it being a burden on the user experience? One could argue that an exploit might be written so that it triggers from the BIOS, before the OS boots. But at that point, any external user-land exploit would provide the same level of access to sensitive data.
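On Linux this is pretty much a solved problem: you can stop the kernel from ever binding a mass-storage device while keyboards and mice keep working. A minimal sketch (the file path is just the conventional modprobe.d location on a typical distro):

```shell
# Prevent the kernel from loading the USB mass-storage drivers,
# while HID devices (keyboards, mice) continue to work normally.
printf 'blacklist usb-storage\nblacklist uas\n' \
  | sudo tee /etc/modprobe.d/disable-usb-storage.conf

# Belt and braces: make explicit load requests fail too, since
# "blacklist" alone does not stop a direct modprobe of the module.
printf 'install usb-storage /bin/false\n' \
  | sudo tee -a /etc/modprobe.d/disable-usb-storage.conf
```

On Windows the rough equivalent is setting the `USBSTOR` service's `Start` registry value to 4 (disabled), which group policy can push fleet-wide, so no epoxy required.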
Since USB is how you debug on Android and even flash the software, that doesn't ring true. If the Uber case is any indication, they instead give you all the freedom but essentially rootkit the system and keep very comprehensive logs and access controls.
Right, I don't think they disable USB; it's more that you're not supposed to use it unless it's within your job description, because they keep logs and will do a forensic file search if the system is tripped.
You do need USB. You still have to plug the Android phone into the PC every time unfortunately - then you can enable wireless debugging and disconnect the cable for that session. When you come back to your PC later, you have to plug in the cable again. I know this because I'm trying to learn Android development and it's infuriating.
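For what it's worth, the cable-per-session dance can be shortened. The classic flow needs USB only once per phone boot, and on Android 11+ the "Wireless debugging" screen in Developer options lets you pair entirely over Wi-Fi. A sketch (the IP address and ports are made up; use whatever your phone displays):

```shell
# Classic flow: plug in once per phone boot, then go wireless.
adb tcpip 5555                  # restart adbd on the phone in TCP mode
adb connect 192.168.1.50:5555   # phone's Wi-Fi IP (hypothetical)
# ...unplug the cable; adb keeps working over Wi-Fi until the phone reboots.

# Android 11+: Settings > Developer options > Wireless debugging
# shows a pairing code plus two ports; no USB cable needed at any point.
adb pair 192.168.1.50:37099     # prompts for the 6-digit pairing code
adb connect 192.168.1.50:40001  # the separate "connect" port shown on-device
```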
Google's a bit more trusting of its employees. There's an expectation that you won't use non-Google services to store or transmit anything work-related, however. No code on laptops, no documents kept outside of G Suite, etc.
Absolutely - I don't understand why this is "new", "odd" or "difficult".
I work for a European company (travel/tourism), and the day they hired me (4 years ago) I got a corporate laptop with Windows 7 which would accept USB mice and keyboards, charge smartphones... but would not "mount" anything that had storage.
Interestingly enough, I can connect my (personal) iPhone and read pictures stored there, but I cannot write to it. I haven't tried installing iTunes because I don't need to back up to my work laptop, so no idea how it would work with dedicated software, but for sure there is no need for epoxy glue or physical locks.
(Granted, I never tried to see how strong the protection was because it's not my specialty and frankly I don't care, but it looks like it works well enough as it is).
Many companies do not allow the physical removal of hard drives: the drives must be shredded before they can exit the door. That's why the "keep your drive" warranty policy is quite popular with many companies.
You're not wrong on their poor employee experience, but there's very little theater involved - there's good reason to protect their IP. Any firm that invests multi-millions in original design will want to protect their property from theft. Semiconductors were mentioned, but also think about someone like Pixar.
Presumably they use network security devices and can see every bit of data that leaves their network, to flag possible exfiltration. Devices like those in use at a company I used to work for man-in-the-middle all HTTPS traffic (using certificates from a CA that corporate IT pushes to all workstations and phones under corporate control, so they're trusted) and all SSH traffic (presumably on the assumption that users always answer "yes" to the "is this the right key?" question; if a user says no, the connection is effectively blocked, so either way the network security device wins). Data exfiltration via USB key, by contrast, is a very real threat that is largely invisible to the company when it happens: no audit trail or anything to look back at.
There are lots of forensics tools which block USB writes to prevent modification of data on drives under analysis.
However, the real solution is to create tools that are easier to use and get the job done better than USB pen-drives. IBM seems to be all stick, no carrot. If they really enforce this (and as a former IBMer, I can tell you they are long on pronouncements and very short on actual follow-through), I foresee people bringing more laptops, and multiple laptops, into places they previously would have brought a USB drive. E.g. if you can't easily move that PowerPoint from laptop A to laptop B because you're in some low-bandwidth hellhole, well, guess you're bringing both laptops along for the ride.