Hacker News | ShawnJG's comments

If you dig deep on this site you will find examples of written applications and video submissions that were successful. That should give you an idea, just an idea mind you, of what YC is looking for. While it seems you're coming up with a multitude of ideas for your project, I would concentrate on presenting a limited scope of what you have to show. This is not a Hollywood audition, so razzle-dazzle will not get you far. Substance is the key. If your idea is well articulated and gets your point across, PG and the rest of the team will be able to evaluate the application properly. You have until October 10, so look over your application and be concise and direct. If you've already accomplished that, wait and hope for the best; if not, edit your application and resubmit. Good luck!


Here are two tips. First, go online to your county clerk's office; they will have a list of newly formed businesses. The listing usually includes name, address, and telephone number. It should be easy to get those people to listen to you, as they are usually in the process of just getting started and in need of design services. Second, you can either actively or passively look for old, dated-looking websites. Identify outdated designs, contact the company with one or two suggestions for improvement, and offer to do it for them. The e-mail should include a link to your previous work.


Not bad for a 48-hour build. With a little attention to formatting and graphical aesthetics, I can see your site easily being used by some blogger. The most important thing you've already done: you made your site valuable. It's a quick and easy way to look up caloric information about very popular foods. You made something people want and made it easy for the layman to understand and use. You followed the first rule of getting something to market fast; now all you have to do is make incremental improvements based on user feedback.


I have to completely disagree with this author. Even if you're confident in your code, have sent your best people to work on it, are able to roll out a phased deployment, and have the capability for quick rollback, an overnight or off-peak deployment is preferable. If anything were to happen, the fewer people affected by it the better. Why put the strain of peak traffic and bandwidth up against a brand-new deployment?


You make a good point. So does the article.

What I was wondering while reading the article: when will it become common that there is no peak or off-peak time? There are some businesses there now, but at some point there will be no such thing for most businesses.


That's a good point that I honestly didn't think to bring up in the article directly--though I did mention that for businesses operating globally, the notion of "overnight" becomes meaningless.

I think the first commenter took the notion a bit far; I didn't mean to imply that we should be deploying during your absolute busiest hours. However, I think deploying overnight is a bad idea--and, moreover, I think it's dangerous to assume that simply "avoiding your customers" is a scalable, long-term solution for deployment.


You're right, "avoiding your customers" is not a scalable long-term solution for deployment. As businesses, big or small, become more global, this will become a bigger problem. As I'm sitting here reading this I see the potential for a new business. We need to find some way to filter traffic. Think about two different versions of the website existing at the same address, and in front of the website sits a fork (like a fork in the road) that splits incoming traffic into two groups. As traffic comes in, a portion, say 20%, is diverted to your brand-new deployment while the rest, 80%, is directed to your older site. It's almost like having a semi-open beta for the new deployment: it gets tested in real time under real circumstances. Then, as bugs get ironed out, you can shift the 80-20 ratio gradually until all traffic is being fed into the brand-new live site. So who wants to co-found this with me? We might be able to make it in time for the next round of Y Combinator!
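To make the "fork" idea concrete, here is a rough Python sketch (the backend URLs and function name are placeholders I made up; a real setup would do this at the load balancer, but the routing logic is basically this):

    import random

    NEW_BACKEND = "https://new.example.com"      # brand-new deployment (placeholder URL)
    OLD_BACKEND = "https://stable.example.com"   # existing site (placeholder URL)

    def route_request(canary_fraction=0.20):
        """Pick a backend for one incoming request.

        canary_fraction is the share of traffic (0.0 to 1.0) sent to the new
        deployment; raise it gradually as bugs get ironed out.
        """
        return NEW_BACKEND if random.random() < canary_fraction else OLD_BACKEND

    if __name__ == "__main__":
        # Rough check: over 10,000 simulated requests, roughly 20% should hit the new site.
        hits = sum(route_request(0.20) == NEW_BACKEND for _ in range(10000))
        print("%.1f%% of traffic went to the new deployment" % (hits / 100.0))

Dial canary_fraction up over time and you get exactly the gradual 80-20 to 0-100 rollover I described.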

By the way, I didn't think you were saying that you should deploy during your busiest time; I just meant that, all things being equal, you should definitely look for "off-peak" times to roll out new deployments. But we both seem to agree that that isn't the long-term solution. Globalization is going to make off-peak times a thing of the past. Hence my solution above; we can get ahead of it and make some money.


Not sure there's a ton of money to be made; solutions like what you're describing already exist and are in use at mature development shops that understand that they don't have to be afraid to roll out new code at 9:30 am.


While postponing your IPO may seem like a good idea, technology companies need to be aware that the industry changes so fast that competitors could emerge; given such fierce competition, their earnings outlook could start to look bleak or, at worst, they could be made obsolete, thereby devaluing the whole company. What happened to MySpace should be a cautionary tale for everyone.


I would first like to preface this by saying that I think insider trading is wrong, and individuals who blatantly and obviously take advantage of privileged private information for profit should be punished. However, in most cases I find that the punishment far outweighs the crime. When bankers and CEOs on Wall Street manipulate the market for their own gain or cause industry crashes through reckless behavior, and ordinary people lose billions of dollars in net worth, the Wall Street hotshots get to walk away scot-free. In my opinion, I would rather have a guy who purchased options on a tip and made money than a reckless money manager who lost a senior citizen's entire life savings. The mortgage crisis was completely avoidable, and most people agree that what those companies did bordered on the illegal, yet no one has been prosecuted for a crime. These high-profile prosecutions by the SEC seem to be just a bit of posturing to polish its image in the news.

Martha Stewart and Mark Cuban were both in the same boat: they had knowledge that their investment was about to tank and that they were about to lose money. What normal person would sit back and allow their money to go down the drain, especially when the information was not solicited by you? In both cases it seems someone came to them with the information; they did not actively seek it out. How could you not act on that tip? I'm not crying for multimillionaires, but it's hard to argue that any normal Joe Blow citizen would do any differently. This is exactly the case with the factory workers. The chief mechanical executive was observant of his surroundings and concluded something was up. He was not involved in the deal; no one came to him and gave him information; he pieced the puzzle together himself. He should not be penalized for that.

The problem is the vagueness of the law. Laws should not be left open to wide speculation and interpretation. Especially in finance, the law should be as well defined and specific as possible. We definitely should have fair markets; without them Wall Street would evolve into an even more chaotic and corrupt place. People need to know where they stand and what they can and cannot do.

This type of disparity in the law and its enforcement is not unique to the financial industry. Drug laws and penalties vary wildly. The Rockefeller drug laws are some of the most notoriously lopsided legislation ever passed. You could do decades for possession of crack but get only a few years of probation for the equivalent amount of cocaine.

Define the laws, level the playing field, and enforce penalties uniformly. It's really just as simple as that.

On a funny note, the guys who used a Croatian seamstress as a front first tried to hire strippers to solicit tips from bankers while they were partying at a strip club. Illegal, yes, but funny as hell.


Great start! You had an idea and were motivated enough to follow through. I wish you the best and hope your app is a big success. I'm going to go take a look at it now to support a fellow HN message board member.


And here is the second repost, by nextparadigms...

Make it even more decentralized and more P2P based. Eliminate as many points of failure as possible. Use something like the Phantom Protocol for increased freedom and security as well:

http://code.google.com/p/phantom/


I posted this topic super early in the morning and it got pushed down before a lot of people could see it. I did manage to get two really good comments before that happened. I'm going to repost them here including the usernames of the people who posted them.

Originally posted by sandroyong

I would replace the server - I mean, do we really need servers? In short, the internet and networks in general were not designed with security in mind. (I would like to apologize in advance for the lengthy response but I want to make myself clear)

The Client-Server model is the most widely adopted model of networking. As the basis of the internet, it is also the most difficult to depose. Even in its most basic form, it precludes security and rejects attempts to secure interacting systems for the following reasons:

1) The server is just that - a slave that serves its masters, the clients. Despite security measures to limit and control client access, the server must at least: a) listen for client requests - clients must be able to locate, and thus can target, servers; and b) attempt to interpret, then determine whether to grant or deny the request - performing redundant (permissions are decided a priori) and risky work in its own environment.
2) Clients need servers throughout their entire presence on the network, so servers remain open to attack throughout a client’s session.
3) Servers have access to all resources and to other clients’ data during any session with an individual client.
4) Exploiting a server confers the ability to exploit all of its clients and all resources.

The mode of implementation of networks introduces even more insecurities:

1) The system relies on explicit security-related information - such information can be falsified, thus the system cannot support non-repudiation.
2) The system transports clients to any destination they specify - it is up to the destination to defend itself against unwanted guests.
3) Any client or server on the network can be discovered by any other client or server - the network can be searched systematically to find and exploit vulnerable targets.

In short, the server is and will continue to be a target for hackers. However, if the target (the server) were removed, would we have security breaches? Probably not, but more importantly for all users, we would not have an internet as we know it. Therefore, we need a medium that accomplishes both - a hardware element that allows for network communications (but without the insecure handles that make up a server) while still letting us make up the network we now call the internet. Pure fantasy, or are we just too ingrained in what we have to be able to ‘think outside the box’?

Let’s look at why. There have been admirable strides towards making today’s systems more secure. There are also significant and proactive efforts focused on finding vulnerabilities and developing patches to fix them before they can be exploited. Although security measures exist, none are truly pre-emptive. In my view, everyone is just making variations of the same thing and, more importantly, we’re just ‘barking up the wrong tree’! Current defense strategies follow three common underlying themes: 1) most tend to focus on particular attackers, attacks, or attack methodologies; 2) many aim to defend particular targets or groups of targets; 3) all are confined and subverted by the existing framework for computing and networks. Taken together, and more importantly, current defensive strategies cannot 1) prevent most unknown attacks, 2) make targets unavailable for exploitation, nor 3) compensate for design flaws in the system being defended. Therefore, would it not be more instructive to examine what enables the initiation of attacks, i.e., what are the handles that allow the system to be breached?

As I alluded to above, the interaction between client and server software/systems, on which (often flawed) software is based, presents too many handles for misuse and abuse. Therefore, this argument points to one and only one common denominator - the network is inherently insecure by design; this is the problem that security people should be addressing, not yet another variant of a patch. So, if I could travel back to the 1990s to design the networks/internet (as you have suggested), knowing what I know now, what and how would I design it? More importantly, can it be redesigned today? Or should security product developers stay content, conform to the present hardware and software platforms, and just develop ‘patches’? Even if software could be made flawless in logic, it may not be feasible to prevent the misuse of software. Don’t forget the human element - that in itself is the weakest ‘link’ in network security.

We are therefore left with redesigning the computer and network environments to allow flawed software (and people) to operate securely and to render such software (and people) inoffensive should their flaws (and actions) be targeted for exploitation. This implies that the network (and the elements of the network) be fault-tolerant or, more to the point, secure by design. That way we can have the benefits of a network and protection from network insecurity in one all-encompassing computer-network infrastructure.

My suggestion: It should be conceivable and possible to map and mete out resources to clients/users as they are accepted onto the network. This is the basis of client completion; for each user/client, a host environment is created in which all allowable services and resources are locally available and locally supported. We already pre-determine the "stuff" the user needs/requires to have access to on the server - so why not bypass the server? Since access to and management of these resources is local, there is no need, or means by which, to interact with the server or the network; the client is thus complete - a discrete entity on the network. The containment of this discrete component must be as complete as possible to ensure that there is no "leakage" to other client or server environments if the user exploits any vulnerability in the client.
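To illustrate what I mean by a "complete" client, here is a toy Python sketch (my own reading of the idea, with made-up policy and resource names - not a real protocol): at admission time the network copies everything a client is entitled to into an isolated local environment, and from then on there is no server to talk to and nothing outside that environment to reach.

    # Toy illustration of "client completion"; names and data are hypothetical.
    from dataclasses import dataclass, field

    # Access policy decided a priori, before the client ever joins the network.
    POLICY = {
        "alice": {"docs/readme.txt", "docs/prices.csv"},
        "bob": {"docs/readme.txt"},
    }

    RESOURCES = {
        "docs/readme.txt": "welcome",
        "docs/prices.csv": "widget,9.99",
    }

    @dataclass
    class ClientEnvironment:
        """A 'complete' client: holds local copies of its allowed resources only."""
        user: str
        local_store: dict = field(default_factory=dict)

        def read(self, name):
            # No server round-trip and no path to other clients' data:
            # anything outside the local store simply does not exist here.
            return self.local_store[name]

    def admit(user):
        """Provision the client's entire environment at admission time."""
        allowed = POLICY.get(user, set())
        return ClientEnvironment(user, {name: RESOURCES[name] for name in allowed})

    if __name__ == "__main__":
        alice = admit("alice")
        print(alice.read("docs/prices.csv"))   # works: provisioned at admission
        bob = admit("bob")
        try:
            bob.read("docs/prices.csv")        # fails: never provisioned, no server to ask
        except KeyError:
            print("bob cannot reach resources outside his environment")

The point of the sketch is only the containment: once admission is done, there is no listening server left to target.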

I can see a distinct and separate internet from the current one and more businesses and services coming out from this model.

OK, I've said enough...hopefully that will spur more discussions on this subject =)


On June 6, 2015 at 3:23 PM Google became self-aware and immediately destroyed Skynet. Very interesting to see how Google works. They could very easily become evil; here's hoping they stay on the straight and narrow. Its strength lies in its multipronged positive feedback loop. But while it may be easier to thwart incremental advances from competing companies, as was the case in the GPS kerfuffle, unlike in most industries a giant leap forward can happen overnight in the Internet age. That would easily put them in a negative feedback loop from which they may not be able to recover. That's what's best about the Internet: there is very little distance between a good idea, its implementation, and consumption by the masses. The roadside is littered with previously high-value tech companies that were devalued overnight by a competing idea.

