
kme : career   22

Kernighan's lever
Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?

Pay close attention to what is actually being said: Having written code as cleverly as you can, you will suddenly face a problem that you are not clever enough to solve.

Kernighan's witty remarks provide a clue: In programming, as soon as you work at your current level, you will automatically end up in a situation where you have to work beyond your current level. By means of this very fortunate mechanism, you will leverage several basic human drives (honour, pride, stubbornness, curiosity) into providing the motivation necessary for improvement.

I call this mechanism Kernighan's lever. By putting in a small amount of motivation towards the short-term goal of implementing some functionality, you suddenly end up with a much larger amount of motivation towards a long term investment in your own personal growth as a programmer.
devel  software  debugging  advice  career  growth  kernighan 
september 2019 by kme
Julio Biason .Net 4.0 - Things I Learnt The Hard Way (in 30 Years of Software Development)
"A language that doesn't affect the way you think about programming, is not worth knowing." -- Alan Perlis
antipatterns  coding  programming  career  advice  tipsandtricks 
june 2019 by kme
The Utter Uselessness of Job Interviews - The New York Times
The key psychological insight here is that people have no trouble turning any information into a coherent narrative. This is true when, as in the case of my friend, the information (i.e., her tardiness) is incorrect. And this is true, as in our experiments, when the information is random. People can’t help seeing signals, even in noise.

There was a final twist in our experiment. We explained what we had done, and what our findings were, to another group of student subjects. Then we asked them to rank the information they would like to have when making a G.P.A. prediction: honest interviews, random interviews, or no interviews at all. They most often ranked no interview last. In other words, a majority felt they would rather base their predictions on an interview they knew to be random than to have to base their predictions on background information alone.

So great is people’s confidence in their ability to glean valuable information from a face-to-face conversation that they feel they can do so even if they know they are not being dealt with squarely. But they are wrong.
theinterview  employment  career  business 
april 2017 by kme
Years of irrelevance – Signal v. Noise
Which leads me to my point: Requiring X years of experience on platform Y in your job posting is, well, ignorant. As long as applicants have 6 months to a year of experience, consider it a moot point for comparison. Focus on other things instead that’ll make much more of a difference. Platform experience is merely a baseline, not a differentiator of real importance.

In turn that means you as an applicant can use requirements like “3-5 years doing this technology” as a gauge of how clued-in the company hiring is. The higher their requirements for years of service in a given technology, the more likely that they’re looking for all the wrong things in their applicants, and thus likely that the rest of the team will be stooges picked for the wrong reasons.
theinterview  career  jobsearch  employment 
january 2017 by kme
The Reality of Developer Burnout — Kenneth Reitz
It happens to everyone who writes code all day long — the sudden feeling of "I'd rather do anything else than this right now" — even though writing software is one of your favorite activities in the world.

You suddenly realize that you've been eating ice cream for three meals every day for years on end. You're tired of it; you don't want to see ice cream any more. People who eat ice cream occasionally won't understand this; how could you possibly want less ice cream?!

— Gary Bernhardt

I have some personal experience with software development burnout, and some tips for recognizing it, avoiding it, and simply dealing with it.

410 GONE is the infamous move that Mark Pilgrim, Python developer and human, made for unknown reasons to distance himself from the developer community and, I assume, retain his own sense of identity while feeling overwhelmed by the pressure of being a "leader" in open source. He suddenly deleted all of his public code from the internet entirely, leaving his users to re-host unofficial remnants of his useful legacy.
programming  softwaredevelopment  devel  stress  career  burnout  410gone  balance 
january 2017 by kme
Don’t Fall Prey to Selection Bias in Your Career Choice - Facts So Romantic - Nautilus
Selection bias is not the only problem with interpreting our beliefs about careers. The psychologist Peter Wason invented the term “confirmation bias” to critique the way that people tend to focus on hypotheses they already believe, seeking confirmation, and ignore other possible hypotheses without testing them. For example, if I decided as a child that I only liked red fruits, I could easily go through life eating strawberries, watermelon, and cherries, and telling myself each time, “Yes, it’s true—I only like red fruit.” But this is mistaken thinking. What I really need to do is occasionally taste some pineapples, pears, and blueberries; perhaps I’ll discover that many non-red fruits are delicious as well.
Similarly, I might originally tell myself that I could never handle an office job, or that I hate working with numbers, or that I only want to live in New York. While it’s possible to go through life only considering careers that match my preconceptions, if I proactively try to disprove my beliefs, I might be pleasantly surprised.

If you’re still in high school or college, perhaps it also makes sense to try casting your net wider for work-shadowing or internships and testing out some options that might surprise you, suggests mathematics writer Kalid Azad. Many of us, he suspects, fall prey to hyperbolic discounting, a model that suggests we over-value immediate rewards and under-value rewards in the future.

“I could spend a week now shadowing a professor, seeing what the job is really like, but I want to go on a trip instead,” says Azad. “I’m trading information that could improve the next 20 years of my life for one event! But because of hyperbolic discounting, the net present value of the next 20 years is so low I think the one week of effort to shadow the professor isn’t worth it—and anyway, the next 20 years is a problem for future-me.”
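The hyperbolic-discounting model Azad alludes to can be made concrete with a toy calculation. This is a minimal sketch: the one-parameter form V = A / (1 + kD) is the standard textbook version, but the reward amounts and the discount rate k below are made up purely for illustration.

```python
def hyperbolic_value(amount, delay_years, k=1.0):
    """Present value of a reward under one-parameter hyperbolic discounting:
    V = A / (1 + k * D), where D is the delay before the reward arrives."""
    return amount / (1 + k * delay_years)

# Azad's trade-off, in made-up units of "value": a fun trip right now,
# versus career information whose undiscounted payoff is twenty times
# larger but accrues over the next 20 years.
trip_now = hyperbolic_value(1.0, 0)        # immediate reward, undiscounted
career_info = hyperbolic_value(20.0, 20)   # 20 / (1 + 20) = 20/21

# The far larger future reward is discounted below the small immediate one,
# which is exactly the "the next 20 years is a problem for future-me" trap.
print(trip_now > career_info)  # -> True
```

With k = 1 the distant payoff shrinks from 20 to roughly 0.95, less than the immediate trip's value of 1.0 — the over-valuing of the immediate reward the excerpt describes.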
career  employment 
october 2016 by kme
Jayesh Lalwani's answer to Since programming can be self-taught, why not major in something other than computer science? - Quora
Since programming can be self-taught, why not major in something other than computer science?

This... this exact assumption behind the question is what holds people in the software industry back: the idea that computer science is just programming is a myth. Calling a computer scientist/software engineer a programmer is like calling a medical scientist a microscope looker, or calling a diplomat a meeting attendee, or calling a surgeon a meat cutter, or calling an architect a drafter. Your job is not defined by the physical actions that you do as part of the job. Your job is defined by the goals that you are trying to achieve.

If you have just graduated with an undergraduate/graduate degree in a field related to computing, get it in your head: I'm not a programmer. Programming is what I do. A lot of entry-level (and dare I say even mid- to senior-level) developers get into a rut because they don't get this in their head. They expect clear requirements, clear deadlines, and a clear design, and they expect that their job is to take a set of clear requirements/designs and convert it into code. Then they sit around expecting someone else to generate the clear requirements and clear design, and mope when they don't have them. They expect that handling ambiguity is someone else's job. Why? "I'm a programmer. I program. Tell me what to program. Because I'm a programmer, and I want to program all day."

No. Wait, let me think about it... NO! You are not a programmer. Programming is what you do. Sure, the best way to program is to resolve all ambiguities before you sit down to code. However, that doesn't mean that it's someone else's job. You are a computer scientist/software engineer. It is you who should be resolving the ambiguities. It is you who should be digging and digging until you understand the business need. It is you who should be figuring out the best way to meet the business need. You are an ambiguity resolver. Programs are what you produce as part of the ambiguity-resolution process. Indeed, in an agile world, programs are used as a tool to resolve business ambiguities. When the business need is ambiguous, you implement it one way, try it out, see whether it works, and if it doesn't, you implement it another way. The programs that you produce are trying to achieve a goal that goes beyond the program itself. Your job is to achieve that goal, not to program.

But wait! I am begging a question here, aren't I? I am making an assumption that a person trained in computing is the best person to pursue these higher goals. After all, if you are making an application for doctors, isn't a doctor the best person to say what the application should do? Or, in other words, aren't domain experts the best people to resolve ambiguities? That is what this question boils down to. After all, they know the domain better than professional technologists.

The thing is, it's been tried before. Before we had professional technologists, computing was seen as a tool to make jobs easier. Computers were thought of as really great calculators. And just as physicists/engineers/scientists would be trained in the use of calculators, they would be trained in the use of computers, which involved learning how to program. There were no professional programmers. Programming was what people of other professions did. There were several problems with this:

1) Non-professional programmers create crap programs
Just about anyone with moderate intelligence can program. There are a few people who can program well. Writing correct, maintainable, scalable code is hard. Doing it in an efficient manner is even harder. Doing it in a sustainable manner with a team of programmers is harder still. Eventually, physicists/engineers/scientists started having dedicated programmers, and then they realized that they had these people who programmed 100% of the time but had been trained to do something else. Early "programmers" like Turing and von Neumann were mathematicians who started building concepts around computing. Later, people like Dennis Ritchie, who had degrees in physics and mathematics, converted those concepts into tools, essentially creating an engineering field out of it. As the needs of computer programs became more complex and the tools became more sophisticated, there was a growing need for people trained on those tools and problems. This led to universities creating programs to teach people those tools and problems.

2) Humans are creatures of ambiguity. Computers are creatures of specification
At the end of the day, humans are creatures of ambiguity. Even the most intelligent people communicate in ways that leave a lot of scope for misunderstanding. Just because you are a world-renowned scientist doesn't mean you are good at resolving ambiguities. Most of the time we don't even realize it: once we get our point across to other humans, we don't need to.
Computers need everything spelled out for them. They are creatures of exact specification. Someone has to resolve the ambiguities in the process of creating computer programs.
The question is who is better at it, the technologist or the domain expert. I know a lot of "programmers" would have the domain experts do it. These same programmers also complain about how crappy the requirements are without doing anything about it. At the end of the day, the person who knows the best way for a computer to do things is the technologist. A technologist who understands the domain creates a more elegant solution. You need to have an understanding of how things work behind the scenes; it's not just about how to write a program. To have an elegant solution, you need to know how compilers work, how the OS works, how databases work, how disks spin, and how bytes float around the network. Also, having the technologist understand the domain allows the technologist to plan for the future, which allows him/her to write maintainable code more cheaply.

Of course, the technologist needs help from the domain expert. However, it's the technologist's responsibility. It's the "programmer's" responsibility to have clear requirements: not the BA's, not the manager's, not the architect's... the programmer's responsibility. Throwing the requirements ball over the wall leads to bad, unmaintainable code.

The task of requirements gathering has to be a joint effort between the technologist and the domain expert, and the technologist cannot divorce him/herself from the messy parts by calling him/herself a "programmer".

3) Not everyone can program
Generally speaking, physicists make good programmers. Mathematicians make good programmers. Financial analysts don't make good programmers. Artists don't make good programmers. People have difficulty understanding computers. A long time ago, computers started up with a prompt, and you had to specify what you wanted the computer to do by writing a program. Most people couldn't use those computers. The only people who used them were people who could program (as a job or as a hobby). The history of personal computing (and UX design in general) has been a series of evolutionary steps that allow the layperson to be divorced from the inner workings of the computer.

In a nutshell: programmers program. Computer engineers/scientists program better than programmers because they do it in a manner that reduces long-term costs. It's better for a computer engineer/scientist to learn the domain than for the domain expert to learn technology, because that leads to elegant solutions.
programming  career  cs 
july 2015 by kme
Brian Bi's answer to What are the top 10 pieces of career advice Brian Bi would give to future software engineers? - Quora
Develop empathy for both your fellow engineers and your users. This advice was given to me by my intern manager at Facebook. Every time you write code, put yourself in the shoes of someone else who will have to maintain it. Every time you implement a new feature, put yourself in the shoes of the average person who will use it. These skills will make you a better engineer, as you'll be better at working as part of a team, and the products you build will be more successful. Too many engineers don't understand this.
devel  programming  software  career  advice 
march 2015 by kme
Advice From An Old Programmer — Learn Python The Hard Way, 2nd Edition
Programming as a profession is only moderately interesting. It can be a good job, but you could make about the same money and be happier running a fast food joint. You're much better off using code as your secret weapon in another profession.
programming  sageadvice  career  advice  tipsandtricks  python 
june 2012 by kme