I see this a lot in consulting. When a new CIO (or CEO or other C level) arrives, they want to make their mark with a digital transformation initiative. This usually just means that the new C level employee is coming into a medium to large business and would like to add a bullet point to their resume and get that new shiny object everyone is talking about. Tableau, Salesforce, data lakes, blockchain, ERP, identity management and "cloud" projects are often the result. It also seems to stem from the new C level employee having a close relationship with a sales rep/partner/C level employee at the vendor. I left a project a couple years ago that had Hadoop interfaces from every system. The user count of all this data? Exactly zero.
One somewhat disturbing trend I've seen at some of the largest corporations- cut/outsource IT support staff to near egregiously low levels to "save money". At the same time kick off 7-9 figure ERP/consulting projects that at best provide fractional value to the organization.
Of course there are counterpoints to this. One of Houston's major pipeline operators pulled off a digital transformation and actually ended up with well designed, highly integrated and easily maintained systems. It took about 5-7 years and had a few reboots, but it eventually landed. That brings me to my final point. These projects often have a timeline that is divorced from reality. Whatever time frame you think a major IT project will take, double it twice, then add 50%, and you are close. It also seems that C level folks are hesitant to hire boutique/small shops with deep industry expertise and years of experience, in favor of big consulting. Nobody ever gets fired for hiring Accenture/Deloitte/PwC. What usually happens in the non trivial niches is that these big shops sleeve the boutiques through them to get things done...
This resonates so much and seems to be a major trend in non-traditional tech companies. I've mostly worked in the financial industry, and the executives' knowledge of technology is almost always horrible. As you said, a couple of buzzwords and very set opinions on the way to do things. It's like they get pet projects in their heads from reading an article in a magazine and get locked into them.
I don't really have an issue with C-level execs being ignorant about a subject. No one knows everything. What I do have an issue with is when they act like experts while ignoring all the people who are actually experts in a particular area. It'd be like me going into a room full of IT people, saying EBITDA a bunch of times, and claiming to be an accounting wizard ready to lead a major initiative. It's frustrating, but I've learned all I can really do is smile and watch the show.
When reading these stories, part of me wants to give up and switch to the dark side. Instead of worrying about whether what we're doing is even useful for anyone, I could be earning money and prestige by leading large companies to deploy random SaaS solutions. What's not to like? I mean, except making your organization waste a couple billion dollars and hundreds of man-years?
Cute. We actually built a data lake (with Python and MySQL) and immediately found problems we weren't even aware of, like (as I mentioned above) people getting the same email multiple times in the same day.
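The duplicate-email problem described above is the kind of thing a single aggregation query surfaces immediately once the data lives in one place. A minimal sketch (the `email_log` table, its columns, and the sample rows are all hypothetical illustrations, not the commenter's actual schema), using SQLite in place of MySQL so it runs standalone:

```python
import sqlite3

# In-memory stand-in for the data lake; schema and data are made up.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE email_log (
        recipient TEXT,
        subject   TEXT,
        sent_date TEXT   -- YYYY-MM-DD
    )
""")
con.executemany(
    "INSERT INTO email_log VALUES (?, ?, ?)",
    [
        ("a@example.com", "Spring newsletter", "2019-04-01"),
        ("a@example.com", "Spring newsletter", "2019-04-01"),  # duplicate send
        ("b@example.com", "Spring newsletter", "2019-04-01"),
    ],
)

# Find recipients who got the same message more than once on the same day.
dupes = con.execute("""
    SELECT recipient, subject, sent_date, COUNT(*) AS n
    FROM email_log
    GROUP BY recipient, subject, sent_date
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [('a@example.com', 'Spring newsletter', '2019-04-01', 2)]
```

The same GROUP BY/HAVING query works essentially unchanged on MySQL; the point is that this class of problem is invisible while each department runs its own mailing system.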
When our sysadmin left, we migrated all our websites from leased hardware servers to cloud hosting and were able to use that head count to hire a developer instead, who has built great new web apps for staff and customers.
I understand the temptation to be cynical, but these really are useful tools. I say embrace change; it's fun.
I work with a guy who can't code. He's a DBA, but I have to fix his queries. Yet when it comes to new projects, he has all the answers on how to implement everything, despite never having written a single line of code in an application.
Recently I even told him to just tell me what he wants to achieve, because his implementations don't make sense and implementation is my job.
At least my boss lets me do it my way. Still annoying, though.
We've got a Salesforce implementation going at the nonprofit where I work. While there was some debate about which big CRM we'd buy, the need to consolidate was blindingly obvious.
Why? Because our organization has been quite forward thinking about allowing managers and executives to source the technology they think they need to succeed. As this article advocates for, IT was largely consultative rather than dictatorial, and a lot of business units were able to pick what they wanted.
But what this has left us with is dozens of places where customer data was being stored, some of them now past their end of life. No central visibility into customer experience. People getting multiple copies of the same email from different departments using different email platforms. Poor deliverability. Subscriptions on random credit cards that suddenly turn off because the person left and no one knows how to get into the admin account and update the card.
We hired a boutique shop to do the Salesforce implementation; we're not scared of doing that. Unfortunately this time it did not pay off... their performance fell off, to the point that they couldn't even reply to emails on time. As sometimes happens with small firms, they grew too fast and exceeded their ability to operate. We can't wait for them to figure it out... so here we go with a big dog firm. Let's see how that goes.
Maybe I'm lucky in who I work with, but I find the "add a bullet point to the resume" take to be maybe a bit too cynical. Tableau, Salesforce, data lakes, ERP, identity management, and "cloud" infrastructure each seem like useful tools if implemented smartly. (Note that I took out blockchain...)
The CRM needed to be sophisticated enough to accommodate high standards for data security and access control, several marketing integrations, and the complex data model that resulted from a permissive culture.
I'm NOT an expert but my understanding is that, in terms of complexity and cost, there's a whole tier above Salesforce where you're provisioning servers and installing Oracle or SAP. We didn't need and could not afford that.
And if you're thinking of smaller CRMs like Hubspot, Zoho, Sugar, Apptivo, or building one from scratch, well, we already had many of those. :-) Those are what Salesforce is replacing.
Our IT department is superb on metrics like security and availability. But they don't know Salesforce, and are not the right people to evolve the broader culture associated with data. The org hired a leader with experience doing this sort of thing, and he is building out an internal permanent Salesforce team which will own the thing after implementation is done.
I mean, fair enough, and I'm sure there are reasons, but from a very naive point of view it sounds like you could have just picked a winner from your current CRMs and consolidated onto it, ending up with a system that arguably already works and lowering your risk.
Still, as an IT contractor, better you pay the big bucks for the projects. :p
One of the side-effects I'm seeing of GDPR is a stronger incentive to consolidate systems under central management. Companies that allowed different departments the leeway to control their own systems now find themselves literally not knowing how many different places a customer's data might live.
Not necessarily. Another way of looking at it is that GDPR is forcing companies to eliminate a common dysfunction, while at the same time restricting their ability to play shenanigans with user data. The end result is companies that are more efficient at what they should be doing, and restricted from doing what they shouldn't be doing. A win-win.
(While many business people used to how things worked are unhappy with the changes, sometimes you really have to bludgeon a fix through the broken incentive structures that plague businesses.)
> Which in turn makes it easier to over analyse and identify user data, exactly what GDPR was meant to avoid.
GDPR "cares" about identifying user data - how else are you going to protect it and control access to it?
As for over analysing it (whatever that means), it really doesn't care too much about that as long as you have explicit meaningful informed permission from the user to do so, protect it properly, and let them control what happens to it (both during and afterwards).
> Nobody every gets fired for hiring Accenture/Deloitte/PwC. What usually happens in the non trivial niches is that these big shops sleeve the boutiques through them to get things done...
To provide a perspective as someone who works for a consulting firm like the ones you've mentioned... Hiring the "big" firms versus boutiques is a lot about a perception of risk, maintaining partnerships (procurement with new vendors is a nightmare everywhere), and leveraging experience across other F500s. For big implementations, think any ERP, our consulting teams can number 100+. Overkill? Probably, but there are few boutiques that have those kinds of resources, and each of those 100 has experience doing implementation work across a range of companies. It's a bit of a vicious cycle for boutiques: they will ultimately struggle to be competitive in these types of bids without seriously underpricing their services, which ends up meaning fewer resources or a less comprehensive scope.
As a side note, when projects go south, the company CIO isn't getting fired, but I've known many leaders (on the consulting side) who got let go for botching multi million dollar contracts.
Agreed. There is a perceived risk with the F500, so they may spend more than is required because they feel they have recourse with the big shops. There are some good teams of folks out there, but there are also shops that are happy to send a dozen folks at $250-300/hr who generate process maps in Visio and PowerPoint slides instead of delivering/implementing a project. I have seen these big shops deliver in a timely manner, and I have seen boondoggles that waste millions. I don't think they are "bad" by default, I just think they are not necessarily the right choice in some circumstances, where the right option is to hire the boutique niche firm that specializes in what they need. This lack of awareness is exactly what the article brings to light: the CIO shoots down the VP who needs a timely solution in favor of a long, protracted big-rock implementation.
There's a strong case for decoupling the process flows, requirements, business analysis, scoping, current state understanding; from the implementation. Maybe even having a separate consulting firm do that work up front before going to bid on the ERP.
Many ERP implementations fail because of wrong assumptions about the business, or inflexibility of the client to modify their business process to fit the ERP. Enter expensive customizations.
These types of projects also affect people's jobs, and that can stir up fears of being replaced, which quickly derails morale on a project. Successful projects empower the people with hands on the keyboard, who know and live those processes, to own and define their future process too. When the consultants or executives are making future business process decisions without that experience, it is risky.
Sure, they get fired but are hired somewhere else because of the contracts they made botching a multi-million dollar contract. There isn't much of a downside to them for over-promising and under-delivering.
My experience working with those consulting firms is that you start with 2 of them, and by the end of the year you end up with 6 while still wondering why. Meanwhile, the test team somewhere offshore isn't doing what they promised, so as the boutique firm you end up doing it yourself.
> One somewhat disturbing trend I've seen at some of the largest corporations- cut/outsource IT support staff to near egregiously low levels to "save money".
I see the opposite too: they staff up on tons of IT people thinking they have a resource shortage, and end up with massive departments that deliver just as little as before.
> It also seems that C level folks are hesitant to hire boutique/small shops that have industry experience and years of experience in favor of big consulting.
The reason this makes sense is because they need to work with companies that have enough resources that they can be really inefficient and have enough capital that they can run for long periods of time and not go under. It’s more of an insurance policy, the quality of the work would be better at the smaller shop of course but they likely couldn’t complete it due to bureaucracy.
Digital transformations can be great. The problem is, anything great will be sold to you by consultancies as a way to give you more consultancy services, which is kind of the opposite of what a digital transformation should be.
This is extremely common. In my company there is this constant battle about the devs having admin rights on their machines. We need admin rights to do our job. We have had dozens of meetings explaining the situation, but IT can't come up with a solution, so the devs go around security because they have no alternative if they want to finish their work. Same with Dropbox. They block it, but we have suppliers who use Dropbox. So the result is that people download confidential files from Dropbox on their home computers or phones and transfer them to their work machines.
In my view security shouldn’t be isolated at corporate headquarters but they should be close to the end users so they see what users need to do and help them to balance security with getting things done. They can’t just block stuff without providing alternatives or they will either hurt the business or they will be circumvented.
I'm one of those assholes that makes security policy. I deal with the same requests. The problem is, I write up a proposal identifying the risks associated with the exemption, along with minimum and recommended compensating controls. This then gets discussed among IT Management, where it is usually decided it's too much overhead, and to just deny the request or if the user can scream loud enough, allow it outright and get some director to sign something. The third oft-used response is ignore the problem and hope the user finds their own work around so we can get back to the 13 projects we're somehow expected to complete this quarter.
> It's an incentive misalignment. IT is evaluated in 'how secure things are' or 'how easy is it to maintain' or 'does this give me more headcount'.
I tend to summarize my experience with IT in large companies and universities like this: IT is evaluated in "how secure things are", or "whether things are not broken". The only sure-fire way to ensure a system is secure and not broken is to make it completely unusable, so that people don't use it. If people don't use a system, they can't break it!
Our outsourced IT has KPIs for closed tickets, so they are very keen to close any ticket for any reason. Then it is up to you to call the first-level helpdesk, restate everything, and open a new ticket in order to actually get things to happen; thus they end up closing double the number of tickets.
Never mind that it took double as long, and sucked the life out of everyone subjected to the system.
We attempt to address this by making IT's annual bonus tied in part to our dev's project completion. It's not perfect, but the heart is in the right place. A big problem with this is that when we're dealing with limited manpower, we'd rather throw it at the easy issues than the hard ones, and ultimately get more things done.
It's proprietary software (adobe, apple/microsoft, autocad, etc) and they file it as a no fix. Or it's open source and it will take $30k+ in engineering time to change it and $100k+ in time wasted waiting.
There is also an inversion of responsibility here. It really should be IT's job to 'adapt the tool to the process' instead of externalizing it to their staff. Until they can do it, the staff needs outweigh the lazy 'default no' of IT.
If you buy a new ERP and it comes with a set of best-practice processes, and your management and employees all insist on transposing the old process onto the new software, in some cases why even buy the new software? It's a different coat of paint on the same thing. You're buying a new ERP for the change in process, but you refuse to adopt it.
I worked at a very bad company, where IT was aggressively incompetent, and mean about it. At best they were negligent, at worst they actively interfered with anyone who they thought was a threat, which included anyone who was more intelligent than them, which was virtually everyone since the company was full of senior EE/ME/RF/CS folks.
And I mean incompetent-- our network would go down for hours, every day, and the problem lasted almost a year. It got so bad our "source control" became yelling "Who's got the latest main.cs?!" and walking it over with a USB key.
I realized we needed to disintermediate the IT group, and my boss was supportive. I ended up establishing our own, parallel IT department with a business cable line, some routers, and NAS devices. I outsourced any task to any cloud service (this was over a decade or so ago when that was somewhat unusual) — eventually hiring our own IT person even. People would ask for access to our network because it was more reliable :-)
Then things got vicious. They actively interfered with our group, trying to get us fired (they had some success with this previously). They were able to get our IT guy fired. Things just went on that way for years. They'd find some way to make trouble, and then we'd route around it. Once they even took my computer for two days, denied they had it, then returned it but locked me out of it. I was certain this was part of some scheme to get me fired, so I turned the computer off, zeroed the hard drive, and req'd a MacBook to work on.
They all eventually got fired when someone took note of their gross incompetence. One of their replacements was eventually fired for embezzling, proving that despite thinking we had the worst IT department anyone could imagine-- we really didn't.
Firstly, we had an exciting product to work on. Beyond that? Selfish and foolish and stupid reasons. I was young; this was my first big gig. Most of all, it was fun. It was fun to keep winning against a determined enemy. At the time, we were considered one of the best software groups in our geographical area, so there wasn't really anywhere better to go, and the employer always ranked as one of the best in the county. Lacking humility, I thought it was simply my intelligence that brought the wins, not a mixture of my intelligence and mostly good luck. I can't imagine how many times I almost got fired but didn't for some reason.
To look on the bright side, my boss was amazing, a truly unique individual whom I thought so much of that I asked him to be the godfather of my child.
Despite the chaos of the IT team, almost everyone I worked with was great. Everyone I met was really good at something, and I learned a lot from the people around me. I had the opportunity to mentor a young engineer who ended up far exceeding me, which was rewarding in itself. I have several life-long friends from that company, people who have been with me through a messy divorce. One particular friend, when my wife left, called or texted every single day for six months-- just out of the goodness of her heart. The text was always the same: "Hey, wanna get some coffee? How are you doing today?" There are beautiful people working in even the most toxic places.
When I wasn't learning about engineering, I was learning about optics, about politics, and about the real world, and once again, I was winning, which was intoxicating.
So why ultimately did I stay? Because it was comfortable being the smartest person in the room. Because it was fun having latitude enough to start my own department? Because it was fun being engaged in this real-life game and winning? Because I'm crazy?
Things got far nastier than I went into. Once the IT department was openly trying to get me fired, things got personal. Really personal. It would have been easier to move on, but, I hate being bullied. Allow me to repeat myself, I HATE being bullied, and that's what this was in the end, organized harassment.
Through a set of unusual circumstances, I ended up befriending the main IT guy's wife-- the man who decided he could play with my life for no reason at all. His wife had a crush on me, so I slept with her, and I am still sleeping with her four years later. To quote that great boss of mine, "Revenge is a dish best served every day."
No job is interesting enough that I would tolerate targeting and succeeding in firing a productive employee for internal politics. Doubly so if that behavior is coming from a department nominally in a support role. That's a dog to be put down.
I have a WiFi router hidden so we can do some specific network testing. We also have a consumer DSL line so we can test on a network where IT doesn’t block random stuff. It’s not even much of a secret. Everybody local knows this stuff and every six months a guy at IT headquarters throws a fit, doesn’t provide an alternative and gets ignored. I have thought about sending them recordings of previous meetings so we don’t have to repeat the series of same meetings every six to twelve months.
We had the same kind of "hidden" router at my previous employer, as we had a team of ~10 developers who needed a lot of different access to different servers and between each other. We put the name of another tenant in the office building as the SSID :).
After about a year the network through that router started to slow down. When we checked why we realized that we had more than 40 wifi clients, as other (non-dev) teams learned about the network through the grapevine and actually the new employees from sales (the neighboring office) were told to connect to that network directly - as the process to connect to the "normal" network was too much of a hassle.
That reminds me of the story of a university where an old DEC VAX got walled up during a remodel. The machine kept running and no one noticed. A decade later another remodel came along, and they tore down the wall to find the machine still happily running.
I work for a local government department. Must try your S3 workaround.
I swear sometimes I think our IS dept is playing Jenga; they just pull random services at will.
Today I couldn't update our website because the proxy settings allowing me to access the login page somehow changed without notice. Last week they blocked USB access on machines without telling anyone, including the people who back up to external 8TB drives. Tomorrow, who knows what they'll decide.
And it doesn't make for a secure environment! Everybody tries to figure out workarounds. Staff actively try to undermine security policies. It's a total disaster.
You can switch it to any port you want. Problem is that it's super easy to spot on security monitoring tools. I deal with "SSH not on port 22" alerts at least once a month.
It's possible to get around this by tunneling SSH over other protocols: http://dag.wiee.rs/howto/ssh-http-tunneling/. Bear in mind if you do this in a corporate environment, security will throw the largest book they can find at you.
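For reference, the approach in that link works by speaking HTTP CONNECT to the corporate proxy so the SSH stream rides inside what looks like ordinary HTTPS traffic. A minimal sketch using OpenSSH's `ProxyCommand` with the `corkscrew` helper might look like the following; the proxy address, port, and destination host are placeholder assumptions:

```
# ~/.ssh/config -- tunnel SSH through an HTTP proxy via CONNECT.
# proxy.corp.example:8080 and home.example.org are placeholders.
Host home
    HostName home.example.org
    Port 443                       # sshd listening on 443 to blend in with HTTPS
    ProxyCommand corkscrew proxy.corp.example 8080 %h %p
```

With that in place, `ssh home` goes through the proxy. And again: in a corporate environment, expect security to treat this as a policy violation, not a clever workaround.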
I used to work at a place like that; it was incredibly frustrating and time-consuming. So I came up with a solution that works even if S3 is blocked: I built https://github.com/OkGoDoIt/UploadAndPaste and set up SCP file hosting on my own server, listening on port 443. (They blocked most outgoing connections to non-standard ports and did MITM sniffing on any port 80 traffic, so this was the only way to get through.) Then I could just easily "paste" a file to my remote server and download it on the other machine via a URL.
I'm still not sure why developers aren't on their own network for development. Have a red box / blue box type system at the developers desk. Given modern networking, it wouldn't actually be that hard to setup and keep development / integration / system tests (or what names you use) away from a locked down production would not be such a bad thing. Having some dual homed file shares wouldn't be that hard either.
1) I would probably still use different terminal color schemes, but being in front of a different physical box might be quite a good thing.
Developer data can still be confidential/sensitive, so you still need to monitor and control this second network with many of the same restrictions as the main one. You still have most of the same risks to compensate for, like data exfiltration and cryptolocker, etc.
It doesn't introduce that many positives for lots of admin overhead, not just in maintaining two distinct networks, but also in ensuring interoperability when needed.
Do you? Why can't you let developers admin their own network? At least some of them will know how to do the bulk of the work and a lot of the typical admin needed for Windows machines won't be that important for them (how often do they need to print something and how many of them would be unable to handle printer drivers themselves)?
It doesn't introduce that many positives for lots of admin overhead
For the IT department sure. But I'm 100% convinced a big reason tech firms exist at all (given that every company uses "tech" in some way, right?) is simply that they know how to manage developers and make them productive. And IT policy is a huge part of that. Sure, the developers might be 1% of a typical non-tech business but they are the 1% that can give you a competitive and productivity edge, so their needs are in many ways more important than other types of employee who may not scale well.
> Why can't you let developers admin their own network?
Being a good software developer doesn't mean you're a good network administrator or good at desktop support. And even if you are, a developer is paid more so it's a poor use of their time.
I'm all for devs having admin access on their own machines, there are too many instances where it's needed, and reasonable exemptions from default policies when they conflict with their work. But a "conflict" would be things like anti-malware software making builds fail, not merely making builds 10% slower.
> But a "conflict" would be things like anti-malware software making builds fail, not merely making builds 10% slower.
Given developer salaries, a friction like making builds 10% slower is hemorrhaging money. And it gets worse if someone is actually waiting for the software to be delivered, and the amount of money the company gets is correlated to keeping the schedule.
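As a back-of-the-envelope illustration of that point (every number below is an assumption for the sake of the sketch, not a figure from the thread):

```python
# Rough cost of a 10% build slowdown across a team.
# All inputs are illustrative assumptions.
loaded_cost_per_year = 150_000    # $/developer/year, fully loaded
work_hours_per_year = 1_800
build_hours_per_day = 1.0         # time a dev spends building/waiting daily
work_days_per_year = 225
slowdown = 0.10                   # builds are 10% slower
team_size = 20

cost_per_hour = loaded_cost_per_year / work_hours_per_year
extra_hours_per_dev = build_hours_per_day * slowdown * work_days_per_year
cost_per_dev = extra_hours_per_dev * cost_per_hour
team_cost = cost_per_dev * team_size
print(f"~${cost_per_dev:,.0f}/dev/year, ~${team_cost:,.0f} for the team")
# prints: ~$1,875/dev/year, ~$37,500 for the team
```

Under these assumptions the "mere" 10% slowdown quietly costs about as much as a junior hire's salary, before counting schedule slip or context-switching losses.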
If you have a group of developers, it would probably be a decent idea to have at least one sysadmin dedicated to developer services. It would go a long way to make build deploys go a lot more smoothly.
Developer data can still be confidential/sensitive
Yes, there are confidential data, but it shouldn't be any real customer data. Right?!? Frankly, given best practices from professional developers, stuff like cryptolockers just aren't an issue (blank the machines). Developers need admin, so building a network for them is actually a lot easier.
> Yes, there are confidential data, but it shouldn't be any real customer data. Right?!?
But it might be. Whenever you're doing software other than for purely internal use, you have a customer that gives you sensitive data which devs absolutely need access to - like requirements for the software you're building!
My personal anecdata is that I've never worked at a place where I couldn't trivially exfiltrate real user data and source code without being traced. This is 20 years across defense contractors, banks, insurance companies, etc.
I understand your argument and in principle I agree with it, but in my experience nobody cares all that much about the data on the primary network, so creating a second network that grants devs things like local admin doesn't seem to increase risk by much.
Shouldn't this kind of thing be a problem for the managers to address? If you just circumvent this kind of nonsense instead of addressing it head on it just proliferates and allows the people who promote it to think they are doing an acceptable job.
At minimum you should inform your direct manager of the situation so they can address it or accept the consequences that the work that depends on the restricted resource won't get done w/o circumventing company policy.
I've heard security management say it is their job to say no all day. They definitely don't care about preventing work getting done. They will only get fired if a data leak occurs, etc.. Preventing work won't even ding their promo outcomes.
Whenever several people of a profession work together in one location, they tend to form a Guild. My dad couldn't plug something into an electrical outlet without having an electrician do it. This is human nature, and is as old as the hills.
The opposite scenario also happens. When security management's job is to never prevent work getting done, their inability to say no to even the most abusive practices can become an issue.
Cloning a production database full of private customer data for testing? Well, we can't interfere with a practice that gets features shipped and the team doesn't have space on their roadmap to build out synthetic data...
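For what it's worth, a first cut at synthetic data doesn't have to be a roadmap item. A hypothetical sketch (the field names and value ranges are invented for illustration) that generates fake customer rows with zero real PII:

```python
import random
import string

random.seed(42)  # deterministic fake data, so tests are repeatable

def fake_customer(i: int) -> dict:
    """Generate one synthetic customer row; no real data involved."""
    name = "".join(random.choices(string.ascii_lowercase, k=8))
    return {
        "id": i,
        "email": f"{name}@example.com",
        "balance_cents": random.randint(0, 1_000_000),
    }

customers = [fake_customer(i) for i in range(1000)]
print(len(customers), "rows generated")
```

This obviously won't reproduce the statistical quirks of real production data, but it is enough to run most functional tests without cloning a database full of private records.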
That's the point of telling your manager and letting them deal with it. Ultimately the business needs to make money or show some result. If the company policies prevent that then their cost benefit needs to be evaluated and that is a manager's job. "No" is fine as long as a conscious decision is being made with the consequences in mind. That way your security person isn't saying no to Dropbox access they are saying "No" to the successful implementation of the CEO's #X priority for the quarter or "No" to 10% additional revenue for the company or whatever.
> In my company there is this constant battle about the devs having admin rights on their machines.
One company I worked for had two lans. The admin lan and the engineering lan. Your admin machine would always work. The eng lan could go to hell, and it/management wouldn't know/care.
I think it worked well. It was kind of like having the common areas of the house neat and tidy, but giving the kids creative control over their rooms (and being able to close the doors when things got out of hand or guests came over).
> In my company there is this constant battle about the devs having admin rights on their machines.
I ended up leaving my last job over this and more stuff like it. They had a url filter in place for instance, that would randomly block access to our network resources. It would never be the same one so every now and again your stuff would start failing and you would lose a few hours debugging until you realized.... GAH! Then have to email and lose the rest of your day waiting for them to fix it.
The "dumb" here isn't even limited to "block Dropbox." Lots of my customers have blanket "block everything that could plausibly be used for file sharing" policies, and explicitly include services literally AIMED at corporate/B2B data exchange like Citrix's ShareFile.
No, we don't have an internal FTP site. No, I won't set one up for you. We use Sharefile for distribution so we don't have to do that. Your IT blocks it? Yeah, that's dumb. Go talk to them; it's not my problem. We're not going to do customized delivery channels just because your halfwit CIO decided to block every site with an upload button.
The issue here seems more cultural than technical - it seems non-tech firms are vastly more paranoid about data sharing or leakage, despite usually having less valuable data. The amount of effort put into anti-exfiltration measures in finance is staggering compared to what existed at Google, and it kills productivity to an enormous degree.
This seems to go hand in hand with a much greater obsession over IP. I've seen people actually threaten to start legal fights over whiteboard diagrams of little to no meaning at all. My guess is that outside the tech industry, new ideas are relatively rare so even very simple ideas feel incredibly valuable. This gets generalised to anything employees produce and is why there's no culture of open source development in most traditional industries.
Not really, no. I'd guess most tech firms have far more ideas than they can ever execute on. Ideas are cheap. Implementations are expensive.
See how all the big firms file gazillions of patents but actual patent infringement suits between them are quite rare. Patents are seen as a defensive posture: everyone knows everyone violates a million patents so by and large, mutually assured destruction is avoided. In a world where ideas were rare you'd see patents be treated as much more valuable.
The funny thing is that they block Dropbox but then there are plenty of shady upload sites that aren’t blocked. We don’t use them because we think they aren’t secure but our IT guys would have no problem with that.
It's worse than that. I haven't met an IT person yet that wasn't as smart as the developers they worked with, just in a slightly different domain. But the incentive structures are aligned in a way that makes IT's real job executing cover-your-ass directives which freeze work. Doing the right thing is literally the opposite of what IT is being paid for.
And they should commit to talking to my VP every time the VP commits us to working with a supplier who uses Dropbox and also commit to finding solutions that allow us to get our work done within deadlines.
I have rarely found that they need admin rights on a day to day basis unless the tool is badly designed. I have one software delivery platform that requires full admin rights, it cannot write just to the user configuration!
However, developers are very good at presenting it otherwise. I have myself run into issues where admin rights were needed, again always because of some poor installer. USB has been blocked as well and, you guessed it, suddenly they cannot do their work.
If they need admin rights all the time then put those particular machines on a protected network and not allow any other business work to occur there.
I think it really depends on what kind of development you are doing. I worked at an IC company, and among other things we developed (and tested) USB drivers for our devices. We installed them on literally hundreds of Windows machines for testing before they were signed/WHQL-approved. This happened all the time, so there was really no way to do our jobs without admin passwords. We did have the machines on isolated networks, but even on the regular networks many of the team members frequently needed to hand-install drivers. This is just one of many examples. If you do anything even remotely hardware-related, it can really be a totally different problem. IT had installed USB filter drivers (which we were not told about, for security reasons) that actually broke a lot of testing in our labs, and it took us months to figure out.
On Windows you often can't even predict if you need admin rights for something. I have had plenty of cases when I tried something and couldn't get the f...ing thing to work. Then try with admin and suddenly it works.
The problem may actually be compliance requirements. SOC2/HITRUST/SOX all mandate the removal of admin rights from computers, mandate an approval process w/ manager approval. Regulated industries, especially banking have more security-related compliance requirements causing a lot of the pain.
Unfortunately from a security perspective devs and system admins are probably the highest risk targets since they typically have access to servers and admin rights. At the very least they have source code an attacker could analyze, and likely have access to external services.
The reality is that compliance, security, and usability are often in direct conflict, and making everyone happy takes significant work.
> SOC2/HITRUST/SOX all mandate the removal of admin rights from computers, mandate an approval process w/ manager approval
I’ve heard this before, but never with any detail. Can you explain further, or point to a resource? For example, clearly SOX doesn’t say that nobody can have admin rights - because IT does. And I highly doubt that the law says that only departments with IT in the title can have admin access. So what does it really say?
SOX doesn't actually say much about IT at all. It says mostly that you need internal controls to maintain the integrity of financial information. Anything more specific than that, around IT, is just someone extrapolating out what they think a good set of internal controls are.
Mostly when you hear "because SOX", that person has never actually read the document.
tyingq summed it up pretty well in the sibling comment. Specifically for SOX, it is all about financial information, but that information is stored and manipulated using computers, so IT-related controls end up being part of it. It's easy to get away with sourcing your controls from an industry-standard list of controls appropriate for SOX, but these are often significantly behind the times and don't acknowledge the state of the art.
These things are all based around "controls", which sometimes can be specified locally and sometimes are set by outside entities.
In my experience (SOX evidence collecting, and SOC control writing and evidence collecting), a well written control is one that covers the bases without being overly prescriptive or ambiguous. In the case of admin rights on computers, it is more useful to have the control be worded as "Only appropriate people who have legit reason to have high levels of access do" and the audit step is confirming and providing evidence that the people who do are documented and approved as having it for specific reasons, and that people who aren't supposed to have it don't have it.
I've run into crappy controls quite a bit. It's easier to push back on these when you're determining the appropriate controls than it is when you're the sucker who has to collect the evidence that doesn't, and won't, exist. Authentication and authorization controls are often some of the worst. A less useful/less meaningful control is one like "all accounts must have passwords and all passwords must be at least 12 characters long and be composed of a mix of alphanumeric and at least two punctuation characters".
The goal is to say "yes, we do this, and here's the proof" without any qualification.
Sorry, none of our accounts have passwords: we disable password authentication and use ssh public keys, with two-factor via Duo. If you say that, you don't satisfy the control as worded. None of your accounts have passwords; you can see this in the shadow file, and sshd has PasswordAuthentication no, but this is difficult to explain to people who are not familiar with ssh, which is, unfortunately, a significant portion of the people who end up being put on audit projects. If you say that they do have passwords, you're lying/misrepresenting, which isn't good for an audit either. If you say you don't but have compensating controls, that doesn't look as good as it could, because it is called out with an addendum/explanation and is a potential exception that needs extra consideration.
These controls should be worded more like "all users have their own accounts and accounts are authenticated using secure methods" with sub-controls specific to the environment/company saying things like "password policy is based on NIST suggestions as of YYYY-MM-DD and enforced via <company-policy-compatible enforcement mechanisms>". The point of the controls is to detect, catch, and re-mediate anomalies, it is for this reason that the controls need to be adjusted as standards change and the state of the art moves forward. The specifics and rationales for why something is in place is for policy documents, which means people usually don't understand why a control is worded the way it is and poorly worded controls make for really rough, drawn out audits.
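As an aside, a control phrased around outcomes can even be checked mechanically. A hypothetical sketch of such an audit check in Python (assuming OpenSSH config syntax and heavily simplified; a real audit would collect the effective config, e.g. via `sshd -T`, from every host rather than parse a text snippet):

```python
# Hypothetical audit helper: does an sshd_config excerpt disable
# password authentication? Simplified parsing for illustration only.
def password_auth_disabled(config_text: str) -> bool:
    for line in config_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        parts = line.split(None, 1)
        if len(parts) == 2 and parts[0].lower() == "passwordauthentication":
            return parts[1].strip().lower() == "no"
    # OpenSSH defaults PasswordAuthentication to "yes", so an absent
    # directive fails the control.
    return False

sample = """
# /etc/ssh/sshd_config (excerpt)
PermitRootLogin no
PasswordAuthentication no
"""
print(password_auth_disabled(sample))  # True
```

Evidence collection then becomes "run the check, attach the output", instead of arguing with an auditor about what a shadow file is.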
The cargo culting is unfortunately too true. SOX/SOC reporting exists for a reason and it's actually pretty easy to get real value (which is the intent) out of it, as it formalizes what you should be doing anyway. It's a really good feeling when appropriate processes/controls reveal things that fell through the cracks and they get remediated. Prepping for and performing a successful audit needs to involve the company's subject matter experts from multiple departments. If only the CFO is involved early in the process, it makes life harder for the CTO, CISO, and CIO (or whoever they delegate to) later on.
This is about Windows desktop development. I need admin rights to install SQL Server, I need them to customize my machine so it's similar to our target environment, and I need to change user permissions all the time to see how things behave under different conditions. There is a ton more I could walk you through, and have, multiple times. Comments like yours come repeatedly from people who don't know about the work we do. I have offered to demonstrate doing our job without admin rights, but so far nobody has even tried. They just keep sending the same email about not needing admin rights, which has repeatedly been shown not to work.
To be punchy about it: you've never moved on from thinking you need admin rights on your machine. Chocolatey for Business' self-service installer, SCCM jobs, and a variety of other tools exist to get specific things that require elevation executed. If you're changing things to test various configurations, wouldn't it be handy to have those scripted and peer reviewed/linted? Then you've got the start of a process to get that script executed on demand.
This stuff isn't that hard - but those of us doing it see the mad things people do when they're given blanket, or even time-bound, admin access. We're the ones dealing with the support calls when every SQL Server installation has been done differently, with no record of what specifically was done. IaC works.
Then I hit another 'no-admin' roadblock that requires days or weeks of hostile IT bureaucracy, and the IT department has just wasted another $3,000+ of employee time. This behavior might drive the developer to quit, leading to a premature $30k+ recruiting and on-ramping cost to replace them.
Now iterate that over 1000s of other instances and you see the financial reason why devs need admin.
Have you ever considered that some people are themselves writing tools like Chocolatey that inherently need elevated rights? I am working on a Windows service that needs to be elevated to work. In addition I need to change TPM keys and change registry settings in the machine hive. The SQL Server installation is local and IT will never be bothered with it. Just let me install it.
Out of curiosity, what's the benefit of me doing bad things in a VM, instead of on my own machine - assuming the VM has full access to the same networks and data as the physical machine?
Unless the VM is somehow sandboxed it's just another box on the same network. So the same reasons for me not being admin on the physical machine (e.g. to not be able to download and run untrusted software because it might spread something on the network) should apply to the VM?
Of course the VM is isolated. That's exactly the point of a VM.
An account inside a VM will only let you play in that VM.
Whereas your account on the host is available and automatically granted access to all machines, fileshares and services on the active directory network. If it got admin rights, then you've got admin pretty much everywhere.
> Whereas your account on the host is available and automatically granted access to all machines, fileshares and services on the active directory network. If it got admin rights, then you've got admin pretty much everywhere.
Nonsense. You can have local admin rights that work only on one machine.
Let's assume for the sake of discussion that to do what I need to do, I not only need to install the program that requires privileges, I also need a few of my company network drives mapped, access to some company systems, internet access, and so on.
If you want a proper dev environment that matches your target you need a proper server to have sql server installed on. I'm pretty sure someone can install sql server on your workstation if you really need it. User permissioning is a dbo task. After that you just have to live with it like the rest of us.
You sound exactly like every other IT guy who doesn’t understand what we are working on. We then explain everything to them and usually they disappear and are never heard of again. That is, until the next guy shows up a year later and the cycle repeats.
Corporate IT can admin the box for corporate training PowerPoint gunk. You get another box to run what you have written, and maybe another to run the development environment. Those don't go on IT's network. You can run a private LAN around the office, not connected to the outside world, in which you break things as you please.
This solution is even good enough for people who are intentionally dealing with malware.
I was a dev in the 90s and the start of the 2000s and always had admin rights. I don't need it any more. If you really have an edge case that requires admin rights, I'm surprised. If you really need SQL Server on your workstation, you should think about using a different database. If your company says you have to use SQL Server, you have to have it on your workstation, and you need to reinstall it regularly, then you're obviously stuck: go up the management chain with your unsolvable problem that breaks their policy. That's very unusual now; most people just moan that they want admin rights when they can live perfectly fine without it.
You're basically saying, "I don't need admin rights anymore and can't think of reasons why anyone else would, so clearly you're wrong, don't know anything about your work, and don't need admin rights either".
Try running Visual Studio without admin rights and you will weep. As for other rights, I tried to onboard a new dev without admin rights; after the 25th IT ticket (each taking days to get done), I gave up.
I run Visual Studio without admin rights every day. But, if you're doing driver development, working with older IIS, certain parts of the registry or developing installers then yeah you're going to have a bad time.
I'm not sure if your "almost ten years ago" is meant to be hyperbolic, or genuine... I can't even remember why, but I know the project I was on 6 years ago definitely needed visual studio to have admin access, and it was all standard C# app stuff (maybe WPF?)
One example off the top of my head: It used to be (may still be) the case that you needed admin rights to install and run the Windows Subsystem for Linux. Sure, you might not need this to do your job, but IT is not really in a position to decide that. It could be that WSL greatly increases your productivity.
> lots of applications can be installed as a user.
Because most of the non-trivial ones still CAN'T be, under Windows, to this day. So special people get a completely separate account with pseudo-admin rights. I have to enter those credentials several times a day.
Then I spoke to a help desk guy, who said he had to enter his domain admin account password 40 TIMES a day.
Let's ignore the SaaS security issues for a second. When IT says "No", it's not like the area asking is going to go away and not try to solve their problem. Organizations are going to find ways to solve their issues, and IT can either help from the beginning or help clean up the mess later. I try to take the stance of offering the right solution and, a lot of the time, a "now" solution at the same time. There is no saying "No" in the long term: either help them now or get stuck with the shadow solution the magic macro guy cobbled together that became a critical business function.
I have seen IT being unaware of and unwilling to meet the requirements of highly specialized technical teams, such as network engineering. You cannot have a TELNET client because the use of TELNET is prohibited by corporate policy; test TCP connections another way. You don't need vim when you have vi. You can't have admin rights, but we don't support drivers for the RS232 dongle, so nope. Sometimes it's quite a challenge to get some work done.
There can be a lot of steps between "help me fix this problem" and fixing the problem. They include qualifying the idea, scoping the request, possibly transforming the request into an abstract form and searching the organization for other people with forms of that abstract problem. Then you get to procurement and you need to figure out if you are getting a specific tool for a specific task or some kind of kitchen sink. Now that you've learned what the kitchen sinks do, you go back to scope and decide if other requests or projects are moving into this one, or if perfect is becoming the enemy of good. Then once your implementation project is done, you need to redesign old processes around it, communicate the change, and offer training.
Saying yes and fixing the problem quickly, without analysis, will often UNDERSERVE your company in the long run, because you only fix a specific problem for a specific person, functional group, or division.
You're essentially suggesting "properly" going through the slow and inefficient bureaucracy machine, when the underlying issue is exactly that people tried to do what you suggested but ended up getting nowhere.
You realize you need a kitchen sink, it turns out there are 2 other teams who already created their own semi functioning kitchen sink which they want you to adopt, but doesn't fit in your kitchen, and the CIO is working with Home Depot to create standardized kitchen sinks for the entire company which will be ready in 3 years (realistically 5 years, or maybe never), but your current sink is leaking and is flooding your kitchen now, so you do what you can to fix it.
Basically a principle of asking for forgiveness rather than permission. Does it cause issues? Of course, any solution to today's problem becomes tomorrow's problem. Now there are 3 semi functioning kitchen sinks in your company, but at least they are functioning
If your machine is slow and inefficient, it needs repair or resources. Fix the machine, don't build a smaller one propped against the side of the garage. A rising tide raises all boats, put your effort towards improving the company by rising the tide.
You are adding redundancy and overhead elsewhere. Now you have 30 people at your company playing account admin, managing passwords and permissions to different platforms. "Oh, but it only takes me a couple minutes a day" x 30 = a part-time admin job.
People tend to ignore succession too. What happens if trains hit people, or they land poorly skydiving, or they move on to another job. Oh they signed up for that critical service with a personal gmail account?
I'm not even advocating kitchen sinks in my post, and neither was the article. The article was pretty explicit that there are two types of projects, big and small, and that you should let small projects be agile but still keep them visible and sanctioned. They don't need to be shadow projects. You are creating a false dilemma. It's not "either it goes into the kitchen sink or it's shadow IT." The machine may very well recognize, "hey, two other teams are working on variants of this, can you all get together and share notes? We would also like copies of everything you have all learned, for when we re-implement this in three years." A good IT rule might be "sure, you can sign up for your SaaS service and manage it yourself, but if it supports SSO, we are implementing SSO or you don't get to sign up."
If you need a stop gap temporary project to last you until the ERP is implemented, good management will recognize that and support it as part of a continuous improvement cycle. They might buy something damn well knowing that they already have a plan started to decommission it in two-three years.
And what happens when you don't have good management? You end up with kludgy solutions and IT constantly falls behind.
As a dev, I will get my job done, and if that means breaking company policy, I'll do it. I've used SOCKS proxies to get around company firewalls because the whitelist time is measured in days, and I have minutes. I've used my phone as a hotspot when IT broke enough of the internet to be a bother. I've used SSH tunnels combined with Nginx reverse proxies to get around routing approval processes. I've even built a port forwarding service because IT took too long to approve and implement their own, and it has been in production for years (I don't think IT is aware of it, though I should probably get it all cleaned up at some point).
If management decides security is important, but not important enough to make efficient, employees will work around the limitations.
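For what it's worth, a port-forwarding service of the kind described above doesn't have to be much code. A minimal sketch in Python (illustrative only; the comment doesn't say how the real one was built, and a production version would need logging, connection limits, and ideally IT sign-off):

```python
# Minimal sketch of a single-connection TCP port forwarder: accept a
# client on one socket and splice its bytes to a target address.
import socket
import threading

def pipe(src: socket.socket, dst: socket.socket) -> None:
    # Copy bytes one way until the source side closes, then signal EOF.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def forward_once(listener: socket.socket, target_addr) -> None:
    # Accept a single client on `listener` and splice it to `target_addr`.
    client, _ = listener.accept()
    upstream = socket.create_connection(target_addr)
    t = threading.Thread(target=pipe, args=(client, upstream))
    t.start()
    pipe(upstream, client)
    t.join()
    client.close()
    upstream.close()

# Usage sketch (hypothetical host and ports):
# listener = socket.socket()
# listener.bind(("127.0.0.1", 8080))
# listener.listen(1)
# forward_once(listener, ("blocked-host.example.com", 443))
```

A real service would accept in a loop and spawn a handler per connection, but the core is just two byte pumps, which is exactly why these shadow workarounds appear so quickly when the official process takes weeks.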
> People tend to ignore succession too. What happens if trains hit people, or they land poorly skydiving, or they move on to another job. Oh they signed up for that critical service with a personal gmail account?
Agreed. Not saying it's a good solution, just saying it's a solution. Can you get burned by that solution? Yes. Usually is the reason many companies have extremely stringent and inflexible IT policies, because they've been burned hard in the past.
> good management
I feel this is really the critical component that is needed. Management, just like any other skilled labor, like programming, will always have an overabundance of mediocre performers and a shortage of 'good' high performers. Meaning we run into issues described here.
The audience of Harvard Business Review articles is managers, probably ones with some level of self-awareness with regard to self-improvement, and this article is written to business management as a set of recommendations on how to handle stodgy IT departments AND rule breakers.
The article we are commenting on is trying to teach management to identify when things need to move through the entire IT project lifecycle, and when to let them mostly self sustain (except obviously nobody should ever sign up for a single thing no matter what it is, if it supports SSO, until the company connects their identity management to it!!)
In my last job I tried to handle this in a similar way. The issue we ran into though was that often these managers would not properly evaluate the software. They would get wowed by the sales guys and sign up for huge contracts without, sometimes, even checking with IT or testing other vendors.
Whenever I'm wearing my IT hat at work, a big part of it is being a detective looking for clues that somebody, somewhere, is about to do this. Then I can insert myself into that process. It would be preferable to be included in the beginning, but one must work within the reality of the situation.
I still shudder thinking about my time working as a developer on corporate-IT-locked-down IBM leased laptops. Every time I ran npm install I needed to request admin access to Windows, which took 2-3 hours to action by the IBM team sitting on the other side of the world in India.
One day a grey beard took pity on me and installed a Linux VM where I was admin, copied the security certs from the Windows host and I could access all corporate resources at my leisure. Never logged a single IT Helpdesk ticket after that.
Honestly on my work machines, though we have root, the difference in FS performance tilts in favour of VMs/containers due to the slow endpoint protection affecting native FS access, and for a lot of tasks we do that outweighs any virtualisation overhead.
Our IT security department was incentivized to deny everything from new tools to new internal applications.
We had an outside firm making security decisions, and if there were any security issues it would end up being on them. So as long as they did not allow us to release any products or install any software, they could not be held responsible.
I made friends with a lower level contractor who told me off the record to use my judgement on what to install to get the job done, because the security department would never approve anything new unless directly instructed to by the CEO.
Fast forward three months, there was a major security flaw on our website (also built with outsourced labor) which allowed anyone to access private data without a login.
A few of us had reported to the security department that the code running the website was so poorly written that the odds of being insecure were close to 100 percent. We suggested upgrading the website and rewriting the code, and management was on board with this but security department refused to allow us to use any new frameworks since they were not approved. Of course in a matter of a few months the site was hacked and millions were spent as a result.
I quit this job after we were unable to release several products after a year even though we jumped through every hoop we needed to. That department killed all innovation.
I think it really depends on the company. If you're something like a nontechnical non-profit, sure, turn that decision making over to IT. In that case IT is performing a vital, skilled function.
But in most software shops, the workers are probably more qualified than the IT department to be making decisions about what applications to use, and what kind of security they need. IT is just there to make things run and fix them when they break. They don't really need to offer guidance.
Joining Google was an eye-opener for me on this. Was the first time I encountered an IT department (TechStop) that didn't act like a police force and instead had your back, helping you get where you needed to be. Was always the first thing I would show guests on a tour of the campus.
TechStop is/was great. But Windows users had locked down workstations where IT whitelisted binaries. I assume the approval process sucked about as much as normal.
Many Google employees use desktop Linux, which is basically unheard of outside the tech world. That by itself simplifies things quite a bit. Not many people are writing viruses posing as screensavers for Google's in-house Linux strain. Anyone who cracks that is probably an APT attacker, and those require different approaches.
It's the user who downloaded a program they "needed", which had malware that sent out a lot of spam email (this was a user who did announcements), which got the e-mail server listed on blacklists - that's what creates these IT policies.
You want to treat people like responsible adults, but they aren't the ones who have to deal with the fallout. Developers know the score for the most part, so full privileges are expected with the caveat, if it all goes bad, we are wiping the machine, not doing a recovery.
IT dreads the moment we are called to account for something some user decided they needed to do.
Most developers understand backup tools and code control; those that don't, well... with great power comes great responsibility.
Yep, a company I worked at hired a tech writer that downloaded some cracked version of software that included ransomware on their first day of work because they said they didn't want to wait for the company to get them a legitimate copy.
Yeah, what I meant is that, these days, the culture is such that one assumes there will be an OSS tool somewhere, before one even considers a sketchy binary. Maybe the OSS option will be inferior, but it's almost guaranteed that it will get some stuff done and not nuke your machine. That's a significant improvement (of course we know that having a github repo is no guarantee and blablabla, but it correlates well enough for most purposes).
To be honest, I find it odd to treat everyone you work with as if they were a customer. I don't believe in this philosophy. The business is my customer. The business is what IT is trying to protect. If individuals are not following policies, they get disciplined, just as HR would discipline for not following policies. It's all in place to protect the business and do what's best for the business. Sure, you'd like admin rights on your own machine; that will help you individually, but will it help the business as a whole if we get hit with cryptowall again?
I find most “IT security policies” that hamper developers to be mostly security theatre. No matter how many policies they put in place, since they aren’t developers, one junior developer can write:
var sql = "select * from Customer where firstname = '" + firstname + "'";
And thwart all of your security “best practices.”
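For contrast, the standard fix is a parameterized query. A hedged sketch, using Python's sqlite3 purely as a stand-in for whatever database driver the shop actually uses (the `Customer` table and values here are invented for illustration):

```python
# Demonstrate why concatenation is injectable and parameter binding is not,
# using an in-memory SQLite database as a stand-in.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customer (firstname TEXT)")
conn.execute("INSERT INTO Customer VALUES ('Alice')")

malicious = "' OR '1'='1"

# String concatenation lets the input rewrite the query...
unsafe = "SELECT * FROM Customer WHERE firstname = '" + malicious + "'"
print(conn.execute(unsafe).fetchall())  # [('Alice',)] -- every row leaks

# ...while a bound parameter is treated as a plain string value.
safe = conn.execute(
    "SELECT * FROM Customer WHERE firstname = ?", (malicious,)
).fetchall()
print(safe)  # [] -- no row literally matches the attack string
```

The same binding mechanism exists in essentially every mainstream driver (e.g. SqlParameter in ADO.NET, PreparedStatement in JDBC), which is why it's a code-review concern rather than something a perimeter policy can catch.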
I was the lead dev at a medium-size non-tech company, and the hoops I had to jump through to get anything done dealing with the "security team" were ridiculous, and of course I didn't have access to production to troubleshoot for a while.
I had ultimate control of all the code that did go through the process. If I were to do something stupid or purposefully malicious, while I didn’t have access to the environment - my code did.
As far as someone mistakenly installing a "crypto wall": if a user can download a program that doesn't require admin access, that program has access to the user's files. The system can be restored much more easily than the user's data.
> I find most "IT security policies" that hamper developers to be mostly security theatre. No matter how many policies they put in place, since they aren't developers, one junior developer can write...
IT policies at large corporations aren't implemented for developers (only). They're implemented for everybody. For every developer, there is a salesperson, admin, manager, or HRBP who will do things they might not fully understand to be "bad".
I came into the industry in the late 90s and still remember the chaos that the ILOVEYOU and Anna Kournikova style viruses caused in corporate offices. Non-technical users didn't know that Windows hid file extensions by default. They didn't think that opening a picture could start a shitstorm that brought the corporate network to its knees. Fun times.
I agree that the current systems and policies for security are inefficient. Security policies mostly seem to be roadblocks to production, roadblocks for developers. It's a sad state at the moment, and that I absolutely agree with. In this case IT isn't as worried about the user's data on that machine; we're worried about the state of that machine taking everything else down with it. Users' data should be stored on the network, though some data may be local. A user with local admin access installing malicious software has a higher risk of propagating everywhere. This is where I notice a divide between developers and IT. You must change your perspective: it's not a single user we're talking about, it's everything. The integrity of the system and of the network is based on the integrity of every node on the network. A vast majority of the threats faced are user-based. Somebody clicked on a link, somebody was spear-phished. The biggest threat to IT security is ourselves.
> Users data should be stored on the network, some data may be local.
If the user has read/write access to the network, so does anything the user runs.
> A user with local admin access and installing malicious software has a higher risk of propagating everywhere.
A sibling post just used the old example of the ILOVEYOU virus, which didn't require admin access to run or spread.
> Somebody clicked on a link, somebody was spear-phished.
And if that happens, and the user gave up their username and password, the perpetrator has access to everything the user has access to. The perpetrator will probably target a user with the access they desire. You say enforce two-factor authentication? That's also easy to scam out of a user: just get them to tell you the 2FA code. It was happening to Uber drivers.
If you can’t trust the user not to do something stupid, you can’t trust anything that the user runs not to do something malicious or be tricked into giving up confidential information.
Implicit in protecting a business is that the business continues to exist, i.e., that it's run competently and can hit revenue targets, it can grow, etc. Focusing on rules & decorum is playing from behind, rather than thinking about how IT can become a trusted partner from inception (so that you are out ahead).
BTW -- if IT's goal is really to protect the business, then you should find & discover the ways people are getting around your fences, because the first thing that a malicious actor is going to do is find & hop those same exact fences. These people finding security holes should be lauded as whitehats finding your mistakes, not people to be punished for not following rules.
>These people finding security holes should be lauded as whitehats finding your mistakes, not people to be punished for not following rules.
Yes, but often if that whitehat reported it and they closed those "holes", you wouldn't be able to get work done, because you can be sure they wouldn't go the extra mile to create a system where you can still do your work; they'd just close the "holes".
I don't think it's reasonable to treat everyone you work with as your customer, but that's not what's being proposed.
IT's role is generally to support the organization. The organization is its customer. For the most part, it doesn't "work with," but it supports. In any organization, there's a complex network of who is a customer, who is a client, who is a peer, and so on.
There are places I'm not IT's customer, but they're the exception rather than the rule. If IT isn't providing a service I need, then that's a failure of IT. At the end of the day, the fallback is to purchase the same service elsewhere. If IT needs to know about that (e.g. for audits or security), it's fine to have a process for that (I report to IT, IT verifies what it wants to), but if that process becomes an unnecessary roadblock (IT doesn't want to compete for my business, rather than a core security issue), either people will circumvent that process or the business will take a hit.
The customer-provider networks vary on business. In some cases, engineering is the customer of marketing, and in some cases, the other way around. You have companies where marketing decides what to build based on customer conversations, and engineering builds it. In other cases, engineering decides what to build, and marketing sells it. And then you have all sorts of cases in between, from synergistic peer relationships to all sorts of balances where one drives but the other informs.
That doesn't change the gross organizational dysfunction being described in this article.
Boxing IT into a support role minimizes its potential contribution. If business enablement is the goal, that includes innovation, business development, and fixing what isn't broken: coming to management with new business ideas instead of either waiting to be handed something, or only moving forward with ideas because they address risk and security.
I don't quite think you understand the point of a support role. People who support me do a lot of innovation, fixing what isn't broken, and all of that. Most are highly empowered and I expect a few to take serious leadership roles in the organization, depending on seniority.
The primary question is one of purpose: someone in a support role is hired to keep me effective and productive, and evaluated on their ability to do so. I am their customer. If I win, they win.
The goal of IT isn't good systems architecture or innovation -- it's me. Supporting me well often requires good systems architecture and innovation. It also requires compromising those at times to my goals, having clean transition strategies, and similar choices as well. Those decisions are made based on their impact on me.
You are using IT as synonymous with support/helpdesk. Do you have an enterprise architect, and do they report to the CIO? Maybe it's the COO? Do you not consider systems architecture part of your IT department?
I absolutely think you are wrong that innovation is derived from your needs; a technology group driving innovation could just as well include obsoleting you. An IT department can bring new business ideas to a leadership team, and arguably its leader is part of that leadership team, at least until every C-suite person is tech-savvy enough to obsolete the CIO position.
I am using IT to refer to more than support/help desk. It includes, for example, having a working network, email, and CRM. It includes custom database applications. It includes an internal wiki and an external web site. It includes lots of other things.
All of those things are there to support the business, not the other way around.
No, the CIO role often carries responsibility for security. A VP violating policy is like skirting regulation: yes, it costs less money, but for all you know they are not compliant with policy and aren't doing the whole job.
However, it does often seem like IT doesn't consider SaaS solutions; they always want to build something themselves without doing a cost analysis.
I have to use SaaS solutions for work, and the security situation terrifies me. I have to put my corporate password, with access to all sorts of important stuff, into a sketchy 3rd-party web site. This looks mighty bad.
Properly implemented, no, you would never do that. You would use a trusted SAML identity provider to authenticate with your domain creds.
Something like Azure AD, ADFS, or a third party you choose to trust, like OneLogin. In all cases you never enter your password into the SaaS service; you are redirected to a secure portal controlled by the auth service, and a token is then issued back to the SaaS service.
Further, it would be recommended not to use an elevated account, and certainly not something like a Domain Admin account, for those services.
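To illustrate why the SaaS side never sees your password: in the SAML 2.0 HTTP-Redirect binding, the service provider just deflates and base64-encodes an AuthnRequest and redirects your browser to the identity provider. A minimal stdlib-only sketch (the IdP URL and request contents below are placeholders, not real endpoints):

```python
import base64
import zlib
from urllib.parse import urlencode

# Placeholder IdP single-sign-on endpoint (e.g. Azure AD / ADFS / OneLogin).
IDP_SSO_URL = "https://idp.example.com/sso"

# Placeholder AuthnRequest; a real one carries issuer, timestamps, etc.
authn_request = (
    '<samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" '
    'ID="_abc123" Version="2.0" '
    'AssertionConsumerServiceURL="https://saas.example.com/acs"/>'
)

# The Redirect binding uses raw DEFLATE (no zlib header), then base64.
compressor = zlib.compressobj(9, zlib.DEFLATED, -15)
deflated = compressor.compress(authn_request.encode()) + compressor.flush()
saml_request = base64.b64encode(deflated).decode()

# The browser is sent here; credentials are only ever typed at the IdP.
redirect_url = IDP_SSO_URL + "?" + urlencode({"SAMLRequest": saml_request})
print(redirect_url[:60], "...")
```

The IdP authenticates the user against the domain and posts a signed assertion back to the SaaS service's consumer URL, so the SaaS vendor never handles domain credentials at all.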
I have the opposite experience - most IT I know would rather outsource as much of their job to "the cloud" as they can, and go feet-up.
The problem is typically that cookie-cutter solutions don't necessarily map to what the leadership requires: either the cost is too high, the knowledge gap is massive (e.g. the tool can do everything, but requires specialized knowledge of an obscure DSL and implementation details only three people in the world have actually mastered...), or the security implications are nontrivial.
To be fair, I do know also people who will always prefer to build their own anyway, because it makes them feel more in control (which they are). It's the CEO's job to rein in these tendencies when necessary, though.
The security triad is confidentiality, integrity, and availability. If security experts don't make sure that their security policies give users access to the things they need, then they are neglecting availability and only doing two-thirds of their job.
If a large part of your job is security, and your "customers" had opted to start stealing product off the floor because it was "easier than waiting in a line", you would be fired for not bringing it up.
That's the situation the CIO had to respond to. Just because it's not part of your role to consider the security implications of these SaaS services doesn't mean he's out of line for doing so.
IT does consider SaaS solutions. When business executives see the cost of those solutions, the business leaders decide to roll their own. SaaS isn't the be-all and end-all for everything. It's all about value add and achieving a goal at the end of the day. Trust me, IT would much rather roll out a SaaS solution: far, far less of a headache and less overhead for the department.
In the situation described in the article, the manager had gone to the CIO, but the CIO refused to help them.
"The CIO admitted that he had been approached and explained that he had informed the VP that IT already had a project with SAP to deliver what the VP needed. “Yes, but that won’t be ready for me to use for three years, and I need something today,” retorted the VP."
The manager had a valid and valuable reason ("increased revenue $1M per month") to require a service that CIO and their organization was unable to provide in a reasonable timeframe, but other companies on the market were.
The problem is bureaucracy and the unwillingness of IT to be agile and responsive. Their weapon is procedure, which quashes initiative. At the same time, though, going outside like that can have major effects on compliance if the company is subject to audits, aside from the security considerations.
The "customer" doesn't care about IT and wants to do things behind the back of the CIO. The "customer" goes to extreme lengths to hide the fact that he is violating company policy by purchasing SaaS with his own credit card. If that employee leaves one day, all the data inside the SaaS is gone because nobody else knew about it. A malicious employee could also use it to extort the company.
Security training focuses way too much on email phishing and not enough on this kind of stuff: actually getting your work done, managing your own computer. Of course people can't be trusted if they haven't been trained. How to handle USB drives. What you can download and run, and from where. What actually IS a program and what isn't.
Many of us learned this the hard way by playing lots of cracked games in the 90s. But not everyone did that.
Try explaining to a non-technical person how a desktop background image isn't a program, so it's basically safe to grab from anywhere, while a screen saver is definitely a program and usually unsafe to get from most places, and a Word document is sometimes a program that might eat your computer. Training could involve things like "which of these 5 webpages would you consider it safe to download and run an executable from?"
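That quiz idea could be sketched as a toy lookup from file extension to rough download risk. The labels are training simplifications, not a real malware policy, and the function name is made up for illustration:

```python
import os

# Toy risk labels for a training quiz: "data" files users think are
# safe can still carry executable content.
RISK_BY_EXTENSION = {
    ".jpg":  "low",     # plain image data, never executed
    ".png":  "low",
    ".scr":  "high",    # a screen saver is an ordinary Windows executable
    ".exe":  "high",
    ".docm": "high",    # macro-enabled Word document
    ".doc":  "medium",  # legacy Word format, may carry macros
    ".pdf":  "medium",  # can embed scripts some readers will run
}

def download_risk(filename: str) -> str:
    """Return a rough risk label for opening a downloaded file."""
    _, ext = os.path.splitext(filename.lower())
    return RISK_BY_EXTENSION.get(ext, "unknown")

print(download_risk("wallpaper.jpg"))  # low
print(download_risk("funny.scr"))      # high
```

Even this toy version shows the pedagogical problem: the mapping is counterintuitive (a "document" can outrank a "screen saver" in some attacks), and real malware happily lies about its extension.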
Having too cumbersome rules around security just means it's ignored or circumvented, increasing risks.
It's tough, I give security awareness trainings myself and I completely agree with what you're saying. However, that's a lot of information to give to a group of new employees that can span any department and technical understanding.
I was actually talking with a customer just today, during a logical assessment, about whether I cover downloading malware in the training. I dedicate an entire section to downloading documents, but I don't really give people the information you're talking about. I tell them how to avoid ever having to download anything, and if they must do it, how to try to do it properly. All of this ends with the process for reporting incidents, because eventually something bad will happen.
As a company you kind of expect this to be solved at a number of layers. Endpoint management should hopefully help resolve this issue. Restricting web access where it makes sense can help. Security awareness training helps keep people aware. Etc, etc, etc. You hope your controls are what save you from incidents, because there is no way you can effectively train your entire company on security topics to a degree that they can make good, security-conscious decisions. That said, many of these SATs are really just checking compliance boxes, because that's the real need. I put my own training together starting with what I know needs to be covered for compliance (PII handling, passwords, acceptable use policy, common threats, security incident reporting, etc.). Anything else that makes it in is purely because I have extra time and I know it to be important.
A concrete example happened just this morning: I needed some documentation that exists on archive.is but has been taken down from the original site. I navigate to the cached content on archive.is, and archive.is is DNS-blocked when going through my VPN by Cisco Umbrella, because apparently it's an "anonymizer" service.
So I change my DNS settings to use a public DNS server first and my company DNS second. Now I can access both archive.is and sites on the company network. Excellent. But in doing this I circumvented all the DNS filtering, not just for this site. The reasonable thing would have been a warning, like the https-style "Are you sure you want to continue to this site?", or a way of whitelisting a single address, perhaps temporarily. Instead my options were to ask an administrator or disable the whole security feature entirely. (Or connect/disconnect the VPN every time I needed something blacklisted, but that didn't feel like a good solution.)
This assumes that a company has a culture that allows training to happen and affect change, instead of being pushed off, ignored, or laughed at.
And training on different file/executable types isn't effective. Many high-profile phishing attacks have been carried out using malicious code embedded in innocuous files like Word documents or PDFs. The only way to actually prevent malicious code from a download is to prevent the download in the first place.
This brings back a memory. The only time I was fired "for cause". Summer after my freshman year at college, I was temping and got an assignment doing real estate purchase comps with a company in the East Bay. At the time, there were laser printers, but often printing sucked up CPU time and let's just say multitasking was still not a widespread thing.
I found myself tired of sitting around. I found a TSR / print spooler that would use RAM and offload the process of printing. This allowed me to keep working. My productivity (as a temp) was higher than many others including the person I was "reporting to" at the company.
They found the print spooler, labeled it "unapproved software", and I was walked out the door.
The funny thing is, a friend at the time (and I didn't realize it) was higher up in the management. He reached out to me on a multi-line BBS that was popular in the area and offered me a full time job a few days later. I was in school and obviously declined.
Working the rest of the summer for a chemical engineer in Martinez/Benicia ended up being incredibly more interesting. So it was a net win.
Developers don't need admin rights for much of anything in this decade. No need to bother with that.
Common software has to be made available in self-service, so developers can install development tools like notepad++ or visual studio.
Deployment is usually the challenge because you have to store binaries somewhere, copy it to some random servers and finally execute it, each step causing numerous security headaches, so there has to be some approved tooling to handle that.
> Developers don't need admin rights for much of anything in this decade. No need to bother with that.
Please defend this position.
My experience, mostly with Linux-like tools, is that those tools are built with the assumption they are being used by someone who knows what they are doing, and that they have the appropriate level of control of the machine -- they are tools for professionals to build tools.
If you don't have rights to install or execute them, you're done. You can't make any forward progress.
Alas, I'm in a similar situation with my current stint and looking for an exit.
The most maddening part for me is literally sitting around helpless, unable to do any development, because you need to wait for your IT support ticket to be looked at. Then having to explain to your manager why work is behind schedule.
However, idle time alone doesn't seem like strong enough reason to open discussion on changing IT policies.
I just witnessed a very similar situation, on a smaller scale, but there are many of these in my company, and they add up.
Boss: "We need access to the database of our primary application that you wrote for us so that we can pull the data into this new tool to track progress."
IT: "No. Not only can we not give you access to YOUR data in YOUR application that we wrote on YOUR dime, we will not allow you to have this new application written by someone who isn't in our group. If you wanted something like this, why didn't you just ask us? We would have written this for you."
Boss: "We had a meeting about this over a year and a half ago, and you told me that you didn't even have the time to discuss it further."
IT: "... Well, we're still not going to let you do this."
IT is effectively holding the rest of the company hostage, and the corporate technical debt is becoming epic. So skunkworks solutions will continue to be developed.
The ideal IT team is one that proactively learns the needs of others in the business and works with them to solve problems. It's no wonder that so many companies end up with shadow IT when so many IT teams are just people who tell you "no" whenever you ask for something. Doing it right is harder in the near-term, but much easier in the long-term as you're not putting out so many fires or going to "disciplinary council" meetings.
Great until you have five teams, each having chosen a different tool, and now you're wondering why the IT support costs are out of control.
Still possible to support, but it requires a different model, e.g. one where IT delivers a new, unconfigured workstation to your new team member and it's up to them to build it. If it breaks, their loss of productivity is their problem and not IT's.
Authority over something (e.g. software selection) must go hand in hand with responsibility for the consequences. Those two must always move in lockstep to avoid perverse outcomes.
Our IT does not support development environments. Just network, backups, printing, client certs. Dev tools support is all essentially peer to peer and ad hoc, with escalation to the internal owner of the tool (another engineer) sometimes possible. If you mess up in an unrecoverable way, IT will give you a loaner to work on while they reimage your machine. It works fine.
As a born and bred corporate IT guy (mostly banks), I used to frown upon this behavior. Then I got one of my bigger career breaks because the finance team went behind IT's back, bought some software, and installed it on a machine they kept under their desk. They then hired people from an IT services company to configure the machine.
The configuration was so bad that it exposed the company's network to the whole wide world. Google contacted the company, and after searching high and low, IT security managed to track the PC down and take it away. The finance team promised to hire someone with the skill set required to run the software in a closed environment. And that's how I ended up getting my job.
This is my current situation. Being a SWE, I see that IT and security are always putting out fires with networks or upper-echelon cybersecurity violation complaints (mostly people downloading software without authorization). They have very little time, almost none for investigating new software, and all software must be installed by them. At the end of the day, nothing gets done on our work computers. I once waited two months for them to say no to a piece of software the team had approved. It's absolutely frustrating.
I've been amused by VMWare being on the strictly-enforced official software list, and the VM being considered data. Nothing in the VM counts as software! It's not even being sneaky. Official policy is that the VM is data.
> The CIO admitted that he had been approached and explained that he had informed the VP that IT already had a project with SAP to deliver what the VP needed. “Yes, but that won’t be ready for me to use for three years, and I need something today,” retorted the VP. The CIO was silent. Then the CEO asked the VP, “I’ve known you for ten years. You don’t seem like someone who would do something to harm the company. Why did you do this?” The VP hit right back: “Since I started this digital customer acquisition program, we’ve increased revenue $1M per month. Before we were losing revenue. If you want, I can shut it down right now. What do you want me to do?”
Maybe not for this particular project, but another interpretation of that is "who cares about security if we're making money" which is a very dangerous argument as well.
When a person says "we need this infrastructure project" and a project is commissioned, acknowledging the need, in my experience that person's job function is unfortunately rarely placed on hold until the appropriate infrastructure has been made available.
"Who cares about your pie-in-the-sky infrastructure project, my boss continues to measure our real performance with basic accounting, and is expecting to be able to report on growth each quarter, which I can't help without tools" seems to be a bit closer to the argument posed here, IMHO.
Of course that's the real question with the actual story in the actual article -- could they have implemented their security on the temporary infrastructure, or "good-enough" security, if the CIO knew about it?
In that story, is the blame on the VP for going ahead instead of getting dialogue started between CEO, VP and CIO? Is it on the CIO for just saying "no" instead of recognizing the need and the value? Is it on the CEO for failing to empower the VP and CIO to get that conversation started themselves?
And then, it's all well and good to worry about the bottom line first, until you're sitting in Equifax's shoes right.
I've been the one in the story who hears "no" often enough that I know which side I'm naturally going to fall on. But I've never been the one who did an Equifax and now has to explain themselves to the board, so there's also that.
It was said well in another comment on this post, which I feel could describe more than a handful of orgs:
> The problem is, I write up a proposal identifying the risks associated with the exemption, along with minimum and recommended compensating controls. This then gets discussed among IT Management, where it is usually decided it's too much overhead, and to just deny the request or if the user can scream loud enough, allow it outright and get some director to sign something. The third oft-used response is ignore the problem and hope the user finds their own work around so we can get back to the 13 projects we're somehow expected to complete this quarter.
> ignore the problem and hope the user finds their own work around
If this is even remotely the story of what happened, you can't really be surprised when the user went off and did their own thing. If they came to you with a specific, high-priority business problem and an expectation of your support in solving it with due urgency, and your answer comes back in the format of a five-year plan... I don't think you can fairly act surprised when they end-around you and solve the problem some other way.
If it means standing on a mountain of chairs for them to do so then I guess there'd have to be shared culpability. So how do we make sure that it never looks attractive to build that mountain of chairs?
I wish I knew more about the "digital customer acquisition program." The story makes it sound like this "VP for a declining line of business" honestly was not going to make it another 3.5 years without some help.
I struggle with this myself, when it seems like we could go ahead and solve a problem for like $80/mo, but instead we're going to study the problem and spend $20-40k out of people's salaries on coming up with a recommendation for an even more expensive project that can only be justified as necessary in order to avoid this other, cheaper tool we could have used.
There's obviously some mismatch when on one hand there's a major project with a vendor like SAP in the picture, but on the other hand there are basic needs that aren't being met, to the point where someone is going to set up "shadow-IT" on a personal credit card just to keep the basic business of the company moving in the right direction.
Yeah, but each "who cares about money, we're doing the secure thing" guy will naturally be outcompeted by the "money over security" guy, since money is the measure of success and is the unit that lets you expand. The hard part is rapidly reacting to a realistic threat model for each situation. That's why good security chiefs are so expensive.
They know when to move that risk control dial in each direction.
Something that's crossed my mind is John Gall's observation that complex systems operate in failure mode 100% of the time. I understand "failure mode" to mean that built-in guards have been bypassed in order to enable the system to do anything at all. Germane to this thread, the "guards" are IT approvals.
I suspect that if a business is complex enough to have IT policy, that policy is always being bypassed in some way, at any given time. Somebody is using unofficial software, or using official software in an unofficial way.
To me it reads like this: the VP didn't care about the consequences of utilizing their solution and didn't care about IT; they simply wanted their stuff done, without acknowledging any prioritization of tasks.
The proper way this could have been resolved is by the VP utilizing the skills of the people they've hired. Does this solution look good, and will it accomplish the task that was prioritized? Excellent! Pass it to IT to evaluate. If the task has a specification, excellent: have somebody in IT look for a product that ticks all the boxes and choose it together.
I work in an IT organization and I see (in the sense of witness) both sides of this. We are over-tasked and under-resourced, and new projects/ideas/initiatives that come in the door go into a backlog of requests. So I see business/end users signing up on their own for SaaS solutions to solve their problems.
> If you don’t think this is happening in your organization, think again
That story probably never happened anyway. But the essence of the article is very true. I've never been in a corp where IT enforced 100% conformity (apart from the medical industry).
Sure, there are actual successful attacks, but that is mostly not the fault of unsanctioned programs.
But there are systems people shouldn't just start using on their own, because information gets lost along the way. That would include CRM and ERP, in my opinion. That a company can exist without a CRM is questionable to begin with, and solutions are plentiful. If they did not have anything like that...
If the story were true, it would not be the fault of Chief Input/Output.
I've been in corporate IT where this happened. All company apps were built internally. None were able to run on anything past Windows XP. On top of my regular help desk, asset management, software project, and lease refresh duties, I was also somehow supposed to make the software work with Windows 7, as they had let the developers go. This is the same company that refused my sane security requirements and ignored just about everything until it was too late. I hear they have since outsourced IT and networking and it's failing dramatically, but they're saving money, right?
Well, that cleared that up then! Gosh I had no idea that the solution would be so simple.
It does shock me that the people who've had their whole infrastructure compromised and held to ransom by viruses and the people who've been held over a barrel by suppliers or had vast amounts of money burned by being locked into a dozen vendor contracts for the same service are so silly and hysterical about it when the solution is as simple as "identify when you need to be best in class and stay small everywhere else".
Hehe, if you think this is nuts, come to pharma. We can't do jack shit with our machines. If you so much as change the time on your machine, that is a 'data integrity breach', and if your actions are determined to be malicious it can result in a firing.
Well, all the rigid policies like no dropbox or no FTP or no whatever, also arise from serious concerns. I just wanted to point out another seemingly innocuous one. Most of our equipment is not internet connected, and we need to manually change the time for daylight savings or other corrections. We have a company policy and procedure to do that periodically so that our audit trails are accurate. Sometimes folks get busy and the shop floor guys take matters into their own hands.
A lot of people who post here don't seem to realize that a lot of companies still have IT departments. A lot of people here are also developers who don't realize they can easily do things to compromise data, even if they think they know better.
My thoughts exactly. It's been a while since I last saw a company with an actual IT department, and even longer since they had an actual clue. The reality is that IT in most SMEs simply sucks and no longer requires a college education. Working in IT for an SME is not a career plan. You're at constant risk of being outsourced, and essentially everything you do is done better by a gazillion companies as a service that probably costs less than a few months of your salary. Frankly, most SMEs would be better off doing exactly that. Most startups I work with do this from day one, for obvious reasons.
Once you hit a certain scale different dynamics may kick in but even then, outsourcing is an option.
As a freelancer these days, I bring my own laptop and am granted access to stuff for the duration of the project. It's understood and expected of me that I do such things as encrypt disks, use 2FA, and don't use "secret" as the password. Most stuff I access for these projects is SAAS based. I'd probably walk away from projects where that wasn't the case.
> I would expect people knowing how to use a computer and what they need to do their job.
Yes, but their job generally isn't maintaining the computer or its supporting infrastructure (networks, shared servers, etc.), or, outside of dedicated programmers, programming the computer even as an incidental task. IT exists to do those things (or to coordinate contracting them out), and the last point, restricting programming to dedicated programmers, has pretty much monotonically increased since the 1980s, when incidental programming was both more common than now and frequently projected to become increasingly common (lots of people said all good jobs would require some).
When MacOS borks itself in an update because the endpoint management software does something funky with partitions, I'd rather delegate that to a team that's handled three cases of that this week and get on with my job description than diagnose the exact steps to resolve a problem that'll often require access I don't have anyway.
Trust is always contextualized. I trust my mom to watch my kids but not to remove my appendix. The purpose of technical controls and security policy is to wall off areas where employee capabilities or motivations are too uneven or complex to safely expose them without a risk of loss that's incompatible with the appetite of the business.
In our company people just started using free Slack en masse, boycotting the horrible IT-approved Skype for Business. When it was discovered that thousands of employees were using Slack, the CTO had to step in, tell IT to fuck off, and start paying for the full version.
> The CIO admitted that he had been approached and explained that he had informed the VP that IT already had a project with SAP to deliver what the VP needed. “Yes, but that won’t be ready for me to use for three years, and I need something today,” retorted the VP. The CIO was silent. Then the CEO asked the VP, “I’ve known you for ten years. You don’t seem like someone who would do something to harm the company. Why did you do this?” The VP hit right back: “Since I started this digital customer acquisition program, we’ve increased revenue $1M per month. Before we were losing revenue. If you want, I can shut it down right now. What do you want me to do?”
Shut it down right now and ask the VP to tender their resignation. Any company doing a 3-year SAP implementation is a very large company. That $1M in additional revenue pales in comparison to the risk introduced by sharing company or personal customer data with a vendor who has not passed the required security auditing. Data is no longer a thing to be thrown around in search of additional revenue, and "but I made money" or "I had to because IT is slow" is not an acceptable post hoc rationalization for the behavior.
Regardless of the merits of large enterprises acting this way, this is a VP who clearly cannot function within the enhanced risk-controlled environment of one and should find a position with a smaller company where they have more freedom to pursue personal initiatives at the VP-level. Those companies exist. Go find one.
That's fair, but just using the time frames in the article, 6 months had passed before the VP got caught, with 3 years still left on the implementation. I think a director of technology who takes 3.5+ years to do a CRM implementation at a 100-person company... isn't doing a very good job. :-)