It's hardly a secret that tesla.com is Drupal -- both the .gitignore and the robots.txt shout it quite loudly, to be fair. One of the larger Drupal agencies, Lullabot, lists them among its clients: https://www.lullabot.com/our-work and they are looking for a sr backend Drupal engineer https://www.tesla.com/careers/search/job/sr-software-enginee... which I would take if the company were not led by Musk.
You probably have; lots of websites still use Drupal, heavily customised of course. Search for "websites made with Drupal" and prepare for your jaw to drop, as a website or two you visited recently will probably show up :)
Ship the files, sure; ship the top-level folder, not really. Most sites will have a "public" subfolder or equivalent, so the READMEs, scripts, sources etc. don't get served. Either way, a professional would remove those files or block them at the HTTP server level.
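For the "block them at the HTTP server level" option, a minimal nginx sketch might look like this (hostname and paths are made up):

```nginx
# Hypothetical vhost: only the "public" subfolder of the checkout is served,
# so the repo root (README, .gitignore, deploy scripts) is simply unreachable.
server {
    listen 80;
    server_name example.com;
    root /var/www/site/public;

    # Belt and braces: refuse common repo leftovers even if they end up in public/
    location ~* /(README[^/]*|composer\.(json|lock)|package(-lock)?\.json)$ {
        deny all;
    }
}
```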
Well, I mean, that might have been somewhat standard practice with mod_python the last time anyone used mod_python, which I would assume was about 15 years ago.
So the unusual thing about PHP (and, historically, mod_python, and CGI for that matter) is that it's normal to have the code actually under the web server's content directory tree. That is, if your content root is /var/www, then you put your code at /var/www/thing.php, so a deployment involves copying stuff into /var/www, and if the server is misconfigured, going to "https://example.com/thing.php" will actually show you the code.
For, say, Java or Ruby web apps, your code is more likely to live elsewhere (people love to fight over exactly _where_...), and run its own web server; nginx or apache or whatever will then proxy requests to that webserver. No matter how it's configured, you're never going to show the end-user the code, or extraneous files like .gitignore. Python's a bit of a corner-case (or at least it used to be last time I worked with Python webapps about a decade ago); it's customary to use WSGI or similar rather than a proper web server, but the effect is much the same.
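The proxying arrangement described above can be sketched in nginx (the upstream port and hostname are assumptions):

```nginx
# nginx terminates HTTP and proxies to the app's own server process;
# the application code lives wherever it likes, never under a served tree,
# so there is no URL that could ever map to a source file.
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:3000;   # hypothetical app server port
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```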
That has very much not been standard practice for PHP for about 10 years now. Applications have a designated web root directory and an entry point that boots the application - as PHP is serverless by design - which is sometimes placed inside the web root by convention, but that is neither a requirement nor a security risk.
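A typical modern PHP layout along these lines looks roughly like this (names are illustrative, in the style of Composer-era frameworks):

```
project/                 <- repo root, NOT served
├── .gitignore
├── composer.json
├── src/                 <- application code
├── vendor/
└── public/              <- the designated web root
    └── index.php        <- entry point that boots the application
```

Only `public/` is mapped to a URL; the dotfiles and sources one level up never leave the server.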
These days, stateful application servers also power modern PHP deployments: they listen on a socket and keep parts of the application in memory, alongside an event loop.
Generally it's very much like Rails: a running process that can be standalone or sit behind a web server interface module, depending on traffic or other requirements.
I don’t know, my servers are generally configured to serve nothing more than the index.php file and anything in the /public directory. I don’t serve up the entire content repository.
Things don't have to be confidential to be an issue. Leaking the actual maintainers' names (as opposed to the Drupal list), for instance, would not necessarily be considered confidential, but it would still be an issue if it showed up.
Yeah this screams complete and utter desperation. Like, I get that hating Elon is what all the cool kids at school are doing this month but do we really need this immature garbage on the front page of HN all day?
Yep, it seems like most of the posters in this thread don't do much software engineering, from the looks of it. Or are being purposely obtuse. There is no security vulnerability in any of the links we've seen so far, minus some unnecessarily deployed boilerplate. The .gitignore file is not the same file your deployment tool uses when publishing a website. If there were an exposed API endpoint, as opposed to some static asset, that would be a problem. Nothing we've seen here indicates that.
Honestly, all the Twitter acquisition has shown is how irrelevant to Twitter’s success the management team was. Twitter has gone from a sophisticated, large organization with 8000 employees to 700 guys following the direction of a random guy making crucial business decisions off Twitter polls (lmao), and if anything it’s become slightly more popular and successful
"More successful" how? Because Elon said so? If it's just about raw page impressions/activity, then perhaps for a while, since most of the western press is reporting on and often directly linking to Twitter right now, but what will happen once the media and their audience are bored of the drama and jump to the next fad?
Do you really believe Twitter will become more profitable under Musk than before, when even the new CEO has already prepped the workers for a possible bankruptcy, a fat debt repayment date is coming closer, and advertisers are running away?
> if anything it’s become slightly more popular and successful
Twitter made $5bil in 2021. Do you really think this or the next quarter, post-Musk acquisition, post-him running off big name advertisers, will even approach any of the worst quarters from the last 3 or 4 years under previous management?
He has all the data. We know for certain Musk would be shouting from the rooftops if that brief burst of Twitter Blue subs made any real dent in revenue.
Well, I'd personally at least find some hilarity in being a Twitter engineer fired by one of those 10x Tesla engineers while they're publishing their .gitignore files via HTTPS (which probably means that their Nginx configuration is fucked).
This is not an issue and just means that their wwwroot probably comes from a repo. Anyone who judges an engineer who made this decision poorly is silly.
I’d say it’s closer to good thing than bad thing due to simplicity.
Not the parent you're answering to, and I don't have a dog in this "elon is a god"/"elon is the devil" fight, but let's stay factual: while the .gitignore is not an issue at all, serving dot files should virtually never be done.
>I’d say it’s closer to good thing than bad thing due to simplicity.
Unless they intended to publish their .gitignore, I'd say it's closer to a bad thing than to a good thing to have random files from your repository open to the public.
The simplest S3 permission is to allow "*" publicly too, but simple doesn't make it better.
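For reference, the "allow everything to everyone" setup being warned about is roughly this bucket policy (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadEverything",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```

Simple, yes; a good idea only if every single object in the bucket is genuinely meant to be public.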
It's barely a vulnerability. Many open source projects have theirs public. It might be a problem if the company's system was terrible and relied on security through obscurity; but maybe they don't care. The engineers who think it's a big deal may have tunnel vision. That can happen if you spend years in a very narrow area.
It's standard practice not to serve any hidden files (names starting with .) over HTTP. The fact that .gitignore is served can indicate they don't block dot-paths at all, so lots of other things could slip through (an .aws directory, for instance).
It has always been standard, it was the #1 thing to do when setting up Apache back when Apache was the standard and nginx was still this obscure Russian porn web server.
.well-known is much more recent and an exception. Can you think of any other .file or .folder which is wise to be exposed publicly?
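The Apache-era rule being described, with the /.well-known/ carve-out, looks roughly like this (a sketch, assuming mod_rewrite is enabled):

```apacheconf
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Let ACME challenges, security.txt etc. through...
    RewriteCond %{REQUEST_URI} !(^|/)\.well-known/
    # ...and return 403 for any other path containing a dot-segment.
    RewriteRule (^|/)\. - [F]
</IfModule>
```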
I was around back then uploading websites (version controlled with svn, not git), and I do not recall it being a standard. The closest standard I can think of is .htaccess files (which we did upload) for various vhost-specific settings.
What is your basis for this standard? Was there a mailing list agreement I missed?
Yes, it's meant to be public, but you need not disclose all of what is contained inside of it. I've been on many pentests where paths provided by robots.txt, that I wouldn't have obtained any other way, led to exploitable vulnerabilities.
For some reason, a considerable number of people don't seem to think twice about adding sensitive paths to robots.
I would rather the paths be secure themselves. Security by obscurity is not a good idea. Anyway, there are not that many combinations of paths, even when you consider all the different CMS defaults.
You're correct that the resources themselves should be secured and that security through obscurity is a bad practice (and an oxymoron, as obscurity doesn't actually provide security).
That said, avoiding security through obscurity doesn't preclude you from giving away less information than is being given away here, nor does it make the act of removing that information entirely pointless. While this isn't the only way that the Drupal version can be identified, it is one, and there's no guarantee your adversary will find it via other avenues. Also keep in mind that with absolutely nothing changing on Tesla's end, this may go from secure to vulnerable, should, for instance, a remotely exploitable vulnerability in the running version of Drupal be discovered and published in the future.
Well, we don't really know. Maybe there's some easy-to-guess text file in /misc/ that contains a password for something. We don't know what we don't know. We do know that there's considerably more information exposed here than zero - the question is whether any of that information could lead to sensitive information, not whether or not it constitutes sensitive information by itself.