First time accepted submitter Mike Sheen writes: I'm the lead developer for an Australian ERP software outfit. For the last 10 years or so we've been using Bugzilla as our issue tracking system, and I made it publicly available to the degree that anyone could search and view bugs. Our software is designed to be extensible, and as such we have a number of 3rd party developers building customizations and integrations with our core product.
We've been pumping out builds and publishing them as "Development Stream (Experimental / Unstable)" and "Release Stream (Stable)", and this is visible on our support site to all. We had also been providing a link next to each build showing the number of bugs fixed and the number of enhancements introduced; the URL would take readers to the Bugzilla list of issues of type bug or enhancement for that milestone.
This had been appreciated by our support and developer community, as they could readily see which issues had been addressed and which new features had been introduced. Before we exposed our Bugzilla database publicly, we produced a sanitized list of changes, which was time-consuming to produce and which I decided was unnecessary given we could just expose the "truth" with simple links to the Bugzilla search for that milestone.
The sales and marketing team didn't like this. Their argument is that competitors use this against us to paint us as producers of buggy software. I argue that transparency is good and beneficial. Our competitors don't publish such information, but if we were to follow their practices we would simply join them in the race to the bottom in terms of software quality and opaqueness.
In my opinion, transparency of software issues provides:
Identification of the release or build in which a given issue is fixed.
Recognition that we are actively developing the software.
Incentive to improve quality controls as our "dirty laundry" is on display.
Information critical to 3rd party developers.
A projection of integrity and honesty.
I've yielded to the sales and marketing demands such that we no longer display the links for fixes and enhancements next to each build, and we now publish "Development Stream (Experimental / Unstable)" as simply "Development Stream". But I know what is coming next: a request to no longer make our Bugzilla database publicly accessible. The Bugzilla database is still publicly exposed; there is just no longer a "click this link to see what we did in this build".
A compromise may be to make the Bugzilla database visible only to vetted resellers and developers, but I'm resistant to creating a closed, "exclusive" culture. I value transparency and recognize its benefits. The sales team are insistent that exposing such detail is a bad thing for sales.
I know by posting in a community like Slashdot that I'm going to get a lot of support for my views, but I'm also interested in what people think about the viewpoint that such transparency could be a bad thing.
158 comments | 3 days ago
New submitter RaDag writes: PostgreSQL outperformed MongoDB, the leading document database and NoSQL-only solution provider, on larger workloads than those covered by initial performance benchmarks. Benchmarks conducted by EnterpriseDB, which released the framework for public scrutiny on GitHub, showed PostgreSQL outperforming MongoDB at selecting, loading, and inserting complex document data in key workloads involving 50 million records. This gives developers the freedom to combine structured and unstructured data in a single database with ACID compliance and relational capabilities.
147 comments | 5 days ago
The recently disclosed bug in bash was bad enough as a theoretical exploit; now, reports Ars Technica, it may already be in use for real attacks. In a blog post yesterday, Robert Graham of Errata Security noted that someone is already using a massive Internet scan to locate vulnerable servers for attack. In a brief scan, he found over 3,000 servers that were vulnerable "just on port 80"—the Internet Protocol port used for normal Web Hypertext Transfer Protocol (HTTP) requests. And his scan broke after a short period, meaning that vast numbers of other servers could also be vulnerable. A Google search by Ars using advanced search parameters yielded over two billion web pages that at least partially fit the profile for the Shellshock exploit. More bad news: "[T]he initial fix for the issue still left Bash vulnerable to attack, according to a new US CERT National Vulnerability Database entry." And CNET is not the only one to say that Shellshock, which can affect Macs running OS X as well as Linux and Unix systems, could be worse than Heartbleed.
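The widely circulated test for the underlying flaw (CVE-2014-6271) is to define a shell function in an environment variable with trailing code and see whether bash executes that code while importing the variable. A minimal sketch of that check, wrapped in Python here for illustration:

```python
import subprocess

# CVE-2014-6271: a vulnerable bash executes the code that follows the
# function definition stored in the environment variable "x" when it
# imports the environment.
env = {"x": "() { :;}; echo vulnerable", "PATH": "/usr/bin:/bin"}
result = subprocess.run(
    ["bash", "-c", "echo marker"],
    env=env, capture_output=True, text=True,
)
# A patched bash prints only "marker"; a vulnerable one prints
# "vulnerable" on the line before it.
print(result.stdout)
```

This is also why the bug is remotely exploitable: web servers running CGI scripts copy attacker-controlled request headers into environment variables before invoking the shell.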
317 comments | about a week ago
HughPickens.com writes Ernesto reports at TorrentFreak that despite its massive presence the Pirate Bay doesn't have a giant server park but operates from the cloud, on virtual machines that can be quickly moved if needed. The site uses 21 "virtual machines" (VMs) hosted at different providers, up four machines from two years ago, in part due to the steady increase in traffic. Eight of the VMs are used for serving the web pages, searches take up another six machines, and the site's database currently runs on two VMs. The remaining five virtual machines are used for load balancing, statistics, the proxy site on port 80, torrent storage and for the controller. In total the VMs use 182 GB of RAM and 94 CPU cores. The total storage capacity is 620 GB. One interesting aspect of The Pirate Bay is that all virtual machines are hosted with commercial cloud hosting providers, who have no clue that The Pirate Bay is among their customers. "Moving to the cloud lets TPB move from country to country, crossing borders seamlessly without downtime. All the servers don't even have to be hosted with the same provider, or even on the same continent." All traffic goes through the load balancer, which masks what the other VMs are doing. This also means that none of the IP-addresses of the cloud hosting providers are publicly linked to TPB. For now, the most vulnerable spot appears to be the site's domain. Just last year TPB burnt through five separate domain names due to takedown threats from registrars. But then again, this doesn't appear to be much of a concern for TPB as the operators have dozens of alternative domain names standing by.
144 comments | about two weeks ago
Advocatus Diaboli writes:
According to a report from Gizmodo, "After six years and over one billion dollars in development, the FBI has just announced that its new biometric facial recognition software system is finally complete. Meaning that, starting soon, photos of tens of millions of U.S. citizens' faces will be captured by the national system on a daily basis. The Next Generation Identification (NGI) program will log all of those faces, and will reference them against its growing database in the event of a crime. It's not just faces, though. Thanks to the shared database dubbed the Interstate Photo System (IPS), everything from tattoos to scars to a person's irises could be enough to secure an ID. What's more, the FBI is estimating that NGI will include as many as 52 million individual faces by next year, collecting identified faces from mug shots and some job applications." Techdirt points out that an assessment of how this system affects privacy was supposed to have preceded the actual rollout. Unfortunately, that assessment is nowhere to be found.
Two recent news items are related. First, at a music festival in Boston last year, face recognition software was tested on festival-goers. Boston police denied involvement, but were seen using the software, and much of the data was carelessly made available online. Second, both Ford and GM are working on bringing face recognition software to cars. It's intended for safety and security — it can act as authentication and to make sure the driver is paying attention to the road.
129 comments | about two weeks ago
An anonymous reader sends this news from El Reg: The U.K.'s National Health Service has ripped the Oracle backbone from a national patient database system and inserted NoSQL running on an open-source stack. Spine2 has gone live following successful redevelopment including redeployment on new, x86 hardware. The project to replace Spine1 had been running for three years with Spine2 now undergoing a 45-day monitoring period. Spine is the NHS’s main secure patient database and messaging platform, spanning a vast estate of blades and SANs. It logs the non-clinical information on 80 million people in Britain – holding data on everything from prescriptions and payments to allergies. Spine is also a messaging hub, serving electronic communications between 20,000 applications that include the Electronic Prescription Service and Summary Care Record. It processes more than 500 complex messages a second.
198 comments | about three weeks ago
An anonymous reader writes Coursera, the online education platform with over 9 million students, appears to have some serious privacy shortcomings. According to one of Stanford's instructors, 'any teacher can dump the entire user database, including over nine million names and email addresses.' Also, 'if you are logged into your Coursera account, any website that you visit can list your course enrollments.' The attack even has a working proof of concept [note: requires Coursera account]. A week after the problems were reported, Coursera still hasn't fixed them.
31 comments | about a month ago
An anonymous reader writes: The second of two lawsuits filed against the U.S. government regarding domestic mass surveillance, ACLU vs. Clapper, was heard on Tuesday by "a three-judge panel on the U.S. Court of Appeals for the 2nd Circuit." The proceeding took an unprecedented two hours (the norm is about 30 minutes), and C-SPAN was allowed to record the whole thing and make the footage available online (video). ACLU's lawyers argued that mass surveillance without warrants violates the 4th Amendment, while lawyers for the federal government argued that provisions within the Patriot Act that legalize mass surveillance without warrants have already been carefully considered and approved by all three branches of government. The judges have yet to issue their ruling.
199 comments | about a month ago
An anonymous reader writes: On August 6, U.S. District Judge Anthony Trenga ordered the federal government to "explain why the government places U.S. citizens who haven't been convicted of any violent crimes on its no-fly database." Unsurprisingly, the federal government objected to the order, once more claiming that to divulge their no-fly list criteria would expose state secrets and thus pose a national security threat. When the judge said he would read the material privately, the government insisted that reading the material "would not assist the Court in deciding the pending Motion to Dismiss (PDF) because it is not an appropriate means to test the scope of the assertion of the State Secrets privilege." The federal government has until September 7 to comply with the judge's order unless the judge is swayed by the government's objection.
248 comments | about a month ago
snydeq writes: Developers are embracing a range of open source technologies, writes Matt Asay, virtually none of which are supported or sold by Red Hat, the purported open source leader. "Ask a CIO her choice to run mission-critical workloads, and her answer is a near immediate 'Red Hat.' Ask her developers what they prefer, however, and it's Ubuntu. Outside the operating system, according to AngelList data compiled by Leo Polovets, these developers go with MySQL, MongoDB, or PostgreSQL for their database; Chef or Puppet for configuration; and ElasticSearch or Solr for search. None of this technology is developed by Red Hat. Yet all of this technology is what the next generation of developers is using to build modern applications. Given that developers are the new kingmakers, Red Hat needs to get out in front of the developer freight train if it wants to remain relevant for the next 20 years, much less the next two."
232 comments | about a month ago
An anonymous reader writes Following up on a recent experiment into the status of software engineers versus managers, Jon Evans writes that the easiest way to find out which companies don't respect their engineers is to learn which companies simply don't understand them. "Engineers are treated as less-than-equal because we are often viewed as idiot savants. We may speak the magic language of machines, the thinking goes, but we aren't business people, so we aren't qualified to make the most important decisions. ... Whereas in fact any engineer worth her salt will tell you that she makes business decisions daily (albeit on the micro, not macro, level) because she has to in order to get the job done. Exactly how long should this database field be? And of what datatype? How and where should it be validated? How do we handle all of the edge cases? These are in fact business decisions, and we make them, because we're at the proverbial coal face, and it would take forever to run every single one of them by the product people and sometimes they wouldn't even understand the technical factors involved. ... It might have made some sense to treat them as separate-but-slightly-inferior when technology was not at the heart of almost every business, but not any more."
371 comments | about a month and a half ago
Yesterday we ran Part One of this two-part video. This is part two. To recap yesterday's text introduction: Detroit recently hosted the North American Science Fiction Convention, drawing thousands of SF fans to see and hear a variety of talks on all sorts of topics. One of the biggest panels featured a discussion on perhaps the greatest technological disappointment of the past fifty years: Where are our d@%& flying cars? Panelists included author and database consultant Jonathan Stars, expert in Aeronautical Management and 20-year veteran of the Air Force Douglas Johnson, author and founder of the Artemis Project Ian Randal Strock, novelist Cindy A. Matthews, Fermilab physicist Bill Higgins, general manager of a nanotechnology company Dr. Charles Dezelah, and astrobiology expert Dr. Nicolle Zellner. As it turns out, the reality of the situation is far less enticing than the dream -- but new technologies offer a glimmer of hope. (Alternate Video Link)
66 comments | about a month and a half ago
mrspoonsi sends this BBC report: "A U.S. juggler facing child sex abuse charges, who jumped bail 14 years ago, has been arrested in Nepal after the use of facial-recognition technology. Street performer Neil Stammer traveled to Nepal eight years ago using a fake passport under the name Kevin Hodges. New facial-recognition software matched his passport picture with a wanted poster the FBI released in January. Mr Stammer, who had owned a magic shop in New Mexico, has now been returned to the U.S. to face trial. The Diplomatic Security Service, which protects U.S. embassies and checks the validity of U.S. visas and passports, had been using FBI wanted posters to test the facial-recognition software, designed to uncover passport fraud. The FBI has been developing its own facial-recognition database as part of the bureau's Next Generation Identification program."
232 comments | about a month and a half ago
Detroit recently hosted the North American Science Fiction Convention, drawing thousands of SF fans to see and hear a variety of talks on all sorts of topics. One of the biggest panels featured a discussion on perhaps the greatest technological disappointment of the past fifty years: Where are our d@%& flying cars? Panelists included author and database consultant Jonathan Stars, expert in Aeronautical Management and 20-year veteran of the Air Force Douglas Johnson, author and founder of the Artemis Project Ian Randal Strock, novelist Cindy A. Matthews, Fermilab physicist Bill Higgins, general manager of a nanotechnology company Dr. Charles Dezelah, and astrobiology expert Dr. Nicolle Zellner. This video and the one you'll see tomorrow show their lively discussion about the economic, social, and political barriers to development and adoption of affordable flying cars. (Alternate Video Link)
107 comments | about a month and a half ago
msm1267 (2804139) writes "Researcher David Litchfield is back at it again, dissecting Oracle software looking for critical bugs. At the Black Hat 2014 conference, Litchfield delivered research on a new data redaction service the company added in Oracle 12c. The service is designed to allow administrators to mask sensitive data, such as credit card numbers or health information, during certain operations. But when Litchfield took a close look he found a slew of trivially exploitable vulnerabilities that bypass the data redaction service and trick the system into returning data that should be masked."
62 comments | about 2 months ago
itwbennett writes Some security researchers on Wednesday said it's still unclear just how serious Hold Security's discovery of a massive database of stolen credentials really is. "The only way we can know if this is a big deal is if we know what the information is and where it came from," said Chester Wisniewski, a senior security advisor at Sophos. "But I can't answer that because the people who disclosed this decided they want to make money off of this. There's no way for others to verify." Wisniewski was referring to an offer by Hold Security to notify website operators if they were affected, but only if they sign up for its breach notification service, which starts at $120 per year.
102 comments | about 2 months ago
Advocatus Diaboli (1627651) writes with the chilling, but not really surprising, news that the U.S. government is aware that many names in its terrorist suspect database are not linked to terrorism in any way. From the article: Nearly half of the people on the U.S. government's widely shared database of terrorist suspects are not connected to any known terrorist group, according to classified government documents obtained by The Intercept. Of the 680,000 people caught up in the government's Terrorist Screening Database — a watchlist of "known or suspected terrorists" that is shared with local law enforcement agencies, private contractors, and foreign governments — more than 40 percent are described by the government as having "no recognized terrorist group affiliation." That category — 280,000 people — dwarfs the number of watchlisted people suspected of ties to al Qaeda, Hamas, and Hezbollah combined.
256 comments | about 2 months ago
wiredmikey writes Mozilla warned on Friday that it had mistakenly exposed information on almost 80,000 members of its Mozilla Developer Network (MDN) as a result of a botched data sanitization process. The discovery was made around June 22 by one of Mozilla's Web developers, Stormy Peters, Director of Developer Relations at Mozilla, said in a security advisory posted to the Mozilla Security Blog on Friday. "Starting on about June 23, for a period of 30 days, a data sanitization process of the Mozilla Developer Network (MDN) site database had been failing, resulting in the accidental disclosure of MDN email addresses of about 76,000 users and encrypted passwords of about 4,000 users on a publicly accessible server," Peters wrote. According to Peters, the encrypted passwords were salted hashes, which by themselves cannot currently be used to authenticate with the MDN. However, Peters warned that MDN users may be at risk if they reused their original MDN passwords on other non-Mozilla websites or authentication systems.
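Salting is the reason the leaked hashes are not immediately usable: each password is hashed together with a per-user random value, so precomputed rainbow tables don't apply and two users with the same password get different hashes. A minimal sketch of the idea (not Mozilla's actual scheme) using a standard key-derivation function:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest) using PBKDF2-HMAC-SHA256 with a random salt."""
    if salt is None:
        salt = os.urandom(16)  # fresh random salt for each user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash with the stored salt; compare in constant time."""
    return hmac.compare_digest(hash_password(password, salt)[1], digest)
```

An attacker holding the leaked table must brute-force each salt individually, which is why the real risk Peters describes is password reuse on weaker sites rather than direct MDN logins.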
80 comments | about 2 months ago
linuxwrangler (582055) writes Job interviews missed, work and wedding plans disrupted, children unable to fly home with their adoptive parents. All this disruption is due to an outage involving the passport and visa processing database at the U.S. State Department. The problems have been ongoing since July 19 and the best estimate for repair is "soon." The system "crashed shortly after maintenance."
162 comments | about 2 months ago
New submitter yeshuawatso writes I work for one of the largest HVAC manufacturers in the world. We've spent millions of dollars investing in an ERP system from Oracle (via a third-party implementor and distributor) that handles most of our global operations, but it's been a great ordeal getting the thing to work for us across SBUs and even departments without constantly going back to the third party, who have their hands out asking for more money. What we've also discovered is that the ERP system is being used for inputting and retrieving data but not for managing the data. Managing the data is handled by systems of spreadsheets and Access databases riddled with macros that turn them into functional applications. I'm asking you wise and experienced readers for your take: is it better to keep hiring our third party to convert these applications into the ERP system, or to hire internal developers to convert them into more scalable and practical applications that interface with the ERP (via API of choice)? We have a ton of spare capacity in data centers that formerly housed mainframes and local servers that now mostly run local Exchange and domain servers. We've consolidated these data centers into our co-location in Atlanta, but the old data centers are still running, just empty. We definitely have the space to run commodity servers for an OpenStack, Eucalyptus, or some other private/hybrid cloud solution, but would this be counterproductive to the goal of standardizing processes? Our CIO wants to dump everything into the ERP (creating a single point of failure, to my mind), but our accountants are having a tough time chewing the additional costs of re-doing every departmental application. What are your experiences with such implementations?
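One concrete shape the in-house route could take: each spreadsheet "application" becomes a small program that validates its rows and hands the ERP a single, well-defined payload. A minimal sketch, where the field names and the endpoint are purely hypothetical (not Oracle's API), just to show macro logic turning into reviewable code:

```python
import json

def rows_to_payload(rows):
    """Validate spreadsheet-style rows and build a JSON body for a
    hypothetical ERP REST endpoint such as POST /api/inventory.
    Each row is a dict like {"sku": "A-100", "qty": "3"}."""
    items = []
    for row in rows:
        qty = int(row["qty"])  # reject non-numeric quantities early
        if qty < 0:
            raise ValueError("quantity cannot be negative: %r" % (row,))
        items.append({"sku": row["sku"], "qty": qty})
    return json.dumps({"items": items})
```

The advantage over the status quo is that the validation rules currently buried in macros become testable code with one path into the ERP, instead of a copy in every workbook.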
209 comments | about 2 months ago