AI has been around for decades. The following describes why AI has recently received a surge of attention, the forms of knowledge AI is impacting, and the limits of AI’s capabilities within those forms. Increasingly, expertise involves the coordination of systems as well as multiple forms of knowledge. Several areas of knowledge are accessible to people but not to AI, and other areas require human expertise in order for AI to increase the expertise sought. The piece concludes with recommendations for salespeople interacting with AI, covering areas of focus, authenticity, trust clusters, and social networks, as they seek to use AI and best position themselves in fields with prospects.
Multiple forms of knowledge
Techne (craft/art knowledge) is a form of knowledge recognized as work expertise in organizations. It includes the recognized technical and managerial skills used to create and do things. Techne is transferable “how to” knowledge. Recognized for their techne, a person can “hang a shingle,” labeling themselves an attorney, marketer, or salesperson and claiming expertise in a profession.
The shingle, so to speak, is a social signal that a person is claiming specialized expertise in a specific category of techne. The social claim can be discounted or disproven by other social actors, institutions, professional organizations, etc. based on their judgment of the results or expertise delivered.
Two types of techne
Techne1 involves mathematically based expertise, such as measuring and constructing a boat that floats or completing the math of financial statements that balance. If the end result of a techne1-based claim is failure – a sinking boat or unbalanced financial statements – the person cannot claim expertise.
Techne2 recognizes that a process of improving end results “better than chance” is a form of expertise. For example, even though doctors practicing medicine will lose patients to disease, they can claim expertise. Likewise, sales professionals close deals with only a fraction of total prospects, yet they too can claim expertise.
Techne2: Passing the law school exam
Interest in AI surged when ChatGPT ventured from the techne1 to the techne2 area of knowledge (see above graphic). The category of generative AI made this possible. AI began participating in professional techne2 areas, such as a law school exam, where expertise did not need to produce a mathematically correct end result; the result simply needed to demonstrate expertise “better than chance.” Generative AI did so, for example, by first passing the law school exam at the bottom of the class rankings and later at the top.
Opening areas of techne2 expertise to AI is a massive expansion. The expansion touches on areas of craft and artistic expertise previously held by professionals working with text, images, coding, etc. However, techne2 expertise often also requires contextualized knowledge that AI cannot access.
Expertise as disposition
AI takes in inputs provided by people in the form of text and data, which are representations of specific situations. These situations are interpreted by people, then reinterpreted by the AI and transcribed back as output representations. In effect, AI outputs are only as good as the inputs they are based on: if the input representations reflect low levels of techne2 expertise, the corresponding output will exhibit low expertise as well.
Techne2 expertise is especially sensitive to the nuances of specific situations. For example, an expert salesperson knows when a prospect is able to buy based on a host of factors, including history, power dynamics, tone, mood, and the ability to “read the room” in social situations. This “feel for the game” reflects the expert salesperson’s disposition.
Dispositions are informed by many lived experiences specific to the area of expertise, often gained in highly complex and shifting situations, as well as by experiences in the areas adjacent to the specific area of techne2 expertise, such as sales. A salesperson, for example, may warm the environment of a sales interaction by referring to a sporting event, children, the weather, etc.
Limits on AI trust
The sales expert knows how and when to do so in a manner that builds trust. The manner in which it is done is socially authentic within the specific context and feel of the situation.
Even though the prospect fully knows that the sales expert’s self-interest is present in the conversation about a sports team, the sales expert is able to navigate the interaction to a successful transaction. There is a difference between the trust generated between two transacting people and the trust generated between a person and a technology, even an AI with the capacity to act with techne2 expertise.
The trust and authenticity generated by AI receiving representations as inputs and delivering them as outputs are not the same as the trust a salesperson generates in a relationship with a prospect. Even if the AI’s trust signals are identical, the fact that AI is not embodied means those signals will not be interpreted in the same way by the person receiving them.
Conclusions: Authenticity, trust clusters, and social networks
In an era of AI-based techne2 expertise, authenticity derived from a relationship-based form of embodied trust will increase in value. Technical trust clusters informed by social networks will also serve as places where prospects vet sales propositions. These trust clusters will parse techne2 sales artifacts (narratives, materials, etc.) and weigh their authenticity based on their AI and human characteristics. Salespeople navigating the fields where trust clusters form and where authenticity is judged will rely on their social capital, using it to build on claims of personal authenticity and further establish relational trust, because they (their narrative, their reputation) embody the products they are selling.