Patents & Innovation
Over dinner, a friend mentioned that she thought a particular country produced the most patents. I remember reading the same article about 10 years ago - I believe it was in the NY Times - but it is no longer true, if it ever was. Looking at patents per capita across a variety of articles based on quality sources, I found that the country does not rank in the top 10, although it does rank well in Bloomberg's Innovation Index.
The latter is not solely based on patent numbers since one needs to consider other measures of innovation. Bloomberg's scoring includes indicators such as R&D spending, manufacturing, the number of high-tech companies, secondary education attainment, and the number of research personnel.
On a separate note, countries with large engineering and semiconductor industries, and those that score well in international comparisons on science and math, will dominate patents and innovation, as will countries with freer cultures. These factors are synergistic: the industries and the social-capital measures feed each other.
Some of my own informal research into Hofstede's cultural dimensions and patent production found that the two dimensions with the highest correlations, and p-values under .01, were Uncertainty Avoidance and Individualism. Essentially, cultures that tolerate ambiguity, are the least rule-based, and score high on individualism produce a larger number of patents.
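As an illustration, the correlation itself is simple to compute. Here is a minimal Python sketch with made-up dimension scores and patent counts - these numbers are illustrative only, not Hofstede's actual data or real patent statistics:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical values: Uncertainty Avoidance index (lower = more tolerant
# of ambiguity) versus patents per million residents.
uncertainty_avoidance = [29, 35, 46, 58, 65, 86, 92, 100]
patents_per_million = [350, 310, 280, 220, 180, 120, 90, 60]

r = pearson_r(uncertainty_avoidance, patents_per_million)
print(round(r, 3))  # strongly negative: lower avoidance, more patents
```

In practice one would also compute the p-value (e.g. via `scipy.stats.pearsonr`) to check significance, as the post does.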
Because of the high-tech industries they support, their high levels of education, and their generally free cultures, the Scandinavian countries perform well. South Korea and Japan perform similarly, and although they generally do not have what we would think of as free cultures - being much more rigid and rule-based - they do have very high levels of technical education and industries that rely on those skills.
The Low Probability of Hiring Software Engineers
Hiring is a fairly complicated process, but, although somewhat obvious, it is easily described by a simple probability equation. Excluding the likelihood of getting past the recruiter:
P(hire) = P(phone screen) * P(sample project) * P(2 interview teams) * P(accepting)
Even including some kind of Bayesian inference - increasing the odds of passing subsequent steps, or tilting candidate characteristics - the probability of a hire remains fairly low, with an increased likelihood of rejecting a good candidate (a false negative). Still, one can understand the aversion to a false positive, as it can be very expensive.
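To make the funnel concrete, here is a quick Python sketch of the equation above; the per-stage probabilities are assumptions for illustration, not measured data:

```python
# Each stage must be passed independently, so the probabilities multiply.
# All four numbers below are made-up assumptions.
p_phone_screen   = 0.50  # pass the phone screen
p_sample_project = 0.60  # pass the take-home sample project
p_interviews     = 0.40  # pass both on-site interview teams
p_accepting      = 0.80  # candidate accepts the offer

p_hire = p_phone_screen * p_sample_project * p_interviews * p_accepting
print(f"P(hire) = {p_hire:.3f}")  # 0.096 -- under 10% per candidate
```

Even with fairly generous stage probabilities, the product drops below 10%, which is the point of the post: multiplying several plausible-looking odds yields a low overall hire rate.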
Source: Bayesian Inference for Hiring Engineers
AI in Software Development
Even before AI, I would have thought that work being done now would be automated, and of course AI will replace some work - developers automate tasks themselves, using rules, patterns, and processes - but the idea is always to stay ahead of the 'crushing wave' of new tech, often automating oneself out of a job, thereby keeping your job...
BTW, the article mentions interesting tools that leverage AI to help coders rather than simply replace them, since the latter is not currently a realistic scenario.
Source: Will A.I. Take Over Your Programming Job?
How do you deal with making sure your use of new technology is correct and free from code-smells?
Responding to How do you deal with making sure your use of new technology is correct and free from code-smells, security issues, etc.?
Issues can be dealt with in several ways.
Understanding what makes high-quality, maintainable code comes first, so knowledge of best practices regarding OOP, SOLID, design patterns, API design, etc. is important. Depending on what you mean by security, best practices regarding transfer protocols, coding styles, validation, storage, etc. are equally something one can learn.
Planning your work is useful, as a well-thought-out design is easier to implement - or at least avoids future problems - compared to just 'winging it'. Diagramming and project plans can be useful at this stage. Self-management is part of this, so using boards and epics/stories/tasks to track work is important, and there are free tools like Visual Studio Team Services (VSTS) or Trello to help.
Requirements gathering matters, so documentation and communication with users and/or clients will make a huge difference. Usability matters too, so understanding how to build code for others, whether it is a UI or an API, will be important to keep your clients happy and to avoid rework. With a UI, mockups can be useful, so using Balsamiq or Visio to put together the basics can be a starting point for discussion with users.
It also depends on your code stack. I work primarily in the Microsoft stack, so there are maintenance tools built into Visual Studio (VS) to check for code quality/maintainability and for code clones. Purchasing licenses for products like ReSharper can help. As part of the automated build process, VSTS has components for testing, code quality (ReSharper), and build quality, executed on check-in.
Independent of the stack, using TDD or unit tests is important, besides saving you time and effort. As an independent, it's tough to work in pairs, but code review can be useful, so enlisting someone to review your work can help.
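As a sketch of the unit-test habit, here is what a TDD-style test might look like in Python, using the standard `unittest` module and a hypothetical `normalize_email` helper invented for the example:

```python
import unittest

def normalize_email(raw: str) -> str:
    """Hypothetical helper: trim whitespace and lowercase an email address."""
    return raw.strip().lower()

class NormalizeEmailTests(unittest.TestCase):
    # In TDD, tests like these would be written first, and the helper
    # above would then be written to make them pass.
    def test_strips_whitespace(self):
        self.assertEqual(normalize_email("  a@b.com "), "a@b.com")

    def test_lowercases(self):
        self.assertEqual(normalize_email("User@Example.COM"), "user@example.com")

if __name__ == "__main__":
    unittest.main()
```

The same pattern applies in the Microsoft stack with MSTest, NUnit, or xUnit wired into the VSTS build.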
Do Algorithms Make You a Better Developer?
Responding to a question on HashNode - "Developers who practise algorithms are better at software development than people who just do development. Is it true?" - I wrote the following:
My feeling is that algorithms help make one a better programmer, but that is likely true of many coding concepts. I did not study algorithms as an undergraduate, so my knowledge was acquired through reading and practice, but after reading and applying Algorithms in a Nutshell, I felt the quality of my work improved. That said, my development work improved more after understanding Design Patterns, or after consuming books on database design.
Since many types of knowledge improve developing and architecting abilities, one has to consider how each helps and to what degree. Algorithms are coding-in-the-small - often narrowly focused solutions - but they can have a great impact at scale. For many applications, a focus on algorithms would be overkill, as the data sets and requirements do not require it; in that context, any middling programmer can optimize a basic loop for performance. Proper database design, whether relational or OLAP/OLTP, will make your applications better from both maintenance and performance perspectives. Object-oriented programming improves some types of designs - those that add objects - while learning correct functional programming helps in contexts where you are increasing the functions on a limited number of objects. Learning enterprise architecture helps in the design of large-scale operations.
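To illustrate "impact at scale": the same membership check can be O(n) or O(1) average-case depending on the data structure chosen. A minimal Python sketch (the data and names are illustrative):

```python
# The same lookup answered by two data structures. On small inputs either
# is fine; at scale, the algorithmic choice dominates running time.
ids_list = list(range(100_000))
ids_set = set(ids_list)

def in_list(x):
    # O(n): scans the list element by element
    return x in ids_list

def in_set(x):
    # O(1) average: a single hash lookup
    return x in ids_set

# Both return identical answers; only the cost per call differs.
print(in_list(99_999), in_set(99_999))  # True True
```

Run a few thousand lookups through each (e.g. with `timeit`) and the gap becomes obvious, which is exactly the "narrow solution, large impact at scale" point above.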
One could equally argue that learning and practicing self-management, communication skills, and code management all make for better programmers, and they do. Ultimately, learning makes one a better developer.