{"id":679,"date":"2023-12-20T14:38:14","date_gmt":"2023-12-20T06:38:14","guid":{"rendered":"http:\/\/madapapa.com\/wordpress\/?p=679"},"modified":"2023-12-20T14:57:13","modified_gmt":"2023-12-20T06:57:13","slug":"openai-desdk-you-sheng-ji-le","status":"publish","type":"post","link":"http:\/\/madapapa.com\/wordpress\/?p=679","title":{"rendered":"openai\u7684sdk\u53c8\u5347\u7ea7\u4e86"},"content":{"rendered":"<pre><code class=\"language-plain_text\"> % pip3 install --upgrade openai\n<\/code><\/pre>\n<pre><code class=\"language-plain_text\">openai.ChatCompletion.create() -&gt; client.chat.completions.create()\n<\/code><\/pre>\n<pre><code class=\"language-plain_text\">    response = openai.chat.completions.create(\n        model = model,\n        messages = messages,\n        temperature = 0,\n    )\n    return response.choices[0].message.content\n<\/code><\/pre>\n<p>\u6211\u5b9e\u9645\u6ca1\u6709\u4fee\u6539client\uff0c\u4e5f\u662f\u6210\u529f\u7684\u3002\u4f46\u662f\u4ece\u4e0b\u9762\u5434\u6069\u8fbe\u7684\u4ecb\u7ecd\u770b\uff0cresponse\u4e5f\u8981\u4fee\u6539\u4e3a\u5c5e\u6027\u8c03\u7528\uff0c\u800c\u4e0d\u662f\u76f4\u63a5\u8c03\u7528\u3002<br \/>\n<img decoding=\"async\" src=\"http:\/\/madapapa.com\/wordpress\/wp-content\/uploads\/2023\/12\/17030544795793.jpg\" alt=\"\" \/><\/p>\n<p>\u4e0d\u8fc7\u7ecf\u8fc7chatgpt\u7684\u89e3\u91ca\uff0c\u6211\u8fd8\u662f\u4f7f\u7528\u4e86client\u8fd9\u4e2a\u5b9e\u4f8b\u6765\u8c03\u7528\u76f8\u5173\u51fd\u6570\uff0c\u539f\u56e0\u5982\u4e0b\uff08\u66f4\u7075\u6d3b\uff0c\u66f4\u597d\u652f\u6301\u5f02\u6b65\u64cd\u4f5c\u3001\u66f4\u7b26\u5408\u884c\u4e1a\u60ef\u4f8b\uff09<\/p>\n<pre><code class=\"language-plain_text\">The shift from using openai.chat to client.chat in the OpenAI API reflects a change in the design of the OpenAI Python library. Here are the key reasons for this change:\n\nClient Instance Configuration: \nUsing a client instance allows for more flexible configuration. 
You can create multiple client instances with different configurations (e.g., different API keys, different base URLs for testing or production environments) within the same application. This is more scalable and adaptable for complex applications or those that need to interact with multiple environments or configurations.\n\nAsynchronous Support: \nThe new API design is likely structured to better support asynchronous operations. By using a client object, it's easier to manage asynchronous requests and responses, which are becoming increasingly important for efficient handling of network operations, especially in web applications and services.\n\nConsistency with Other SDKs: \nMany modern API SDKs (Software Development Kits) use a similar pattern where you create a client instance and then use methods on this instance to interact with the API. This design pattern is familiar to many developers and aligns with best practices in software development.\n\nEncapsulation and Extensibility: \nBy using a client object, the OpenAI library can encapsulate functionality and state more effectively. This makes the library easier to extend and maintain. It also allows for better handling of resources like network connections.\n\nError Handling and Debugging: \nA client-based approach can offer improved error handling and debugging capabilities. 
Since each client instance can manage its own state and configuration, it's easier to trace and handle errors, log activities, and debug issues.\n\nIn summary, the shift to using a client instance in the OpenAI Python library is likely driven by a desire for greater flexibility, support for asynchronous operations, consistency with modern software development practices, improved encapsulation and extensibility, and enhanced error handling and debugging capabilities.\n<\/code><\/pre>\n<p>A more detailed look at the benefits of asynchronous calls<\/p>\n<pre><code class=\"language-plain_text\">The emphasis on asynchronous support in the new API design, particularly with the use of a client object, reflects a broader trend in software development towards more efficient and scalable network operations. Here's a deeper look into why this is beneficial:\n\n1. Improved Performance and Scalability:\nNon-blocking Calls: Asynchronous operations allow your application to make non-blocking network calls. This means that your application doesn't have to wait for a response from the OpenAI server before continuing with other tasks. This is especially beneficial in web applications where multiple users might be making requests at the same time.\nHandling Multiple Requests: Asynchronous programming is more efficient at handling multiple simultaneous network requests. This is crucial for high-load applications that need to maintain responsiveness under heavy request volumes.\n\n2. Better Resource Utilization:\nConcurrency: Asynchronous operations enable better utilization of system resources. While waiting for a response from an API call, your application can perform other tasks, thereby making better use of the CPU and other resources.\nReduced Latency: In a synchronous model, each operation must complete before the next one starts, potentially leading to higher latency. Asynchronous operations can overlap, which can reduce overall latency in the application.\n\n3. 
Enhanced User Experience:\nResponsive Applications: In a web or mobile application, asynchronous operations can significantly enhance the user experience. Users aren't left waiting for operations to complete and can continue interacting with other parts of the application.\nReal-time Updates: Asynchronous programming facilitates real-time updates to the user interface, which can be crucial for applications that require immediate feedback, such as chatbots or live data monitoring.\n\n4. Simplified Error Handling:\nAsynchronous workflows: Asynchronous programming often comes with more sophisticated ways to handle errors and exceptions. For instance, in Python's asyncio, you can use try\/except blocks within asynchronous functions to manage exceptions effectively.\n\n5. Alignment with Modern Web Standards:\nWebSockets and HTTP\/2: Modern web protocols like WebSockets and HTTP\/2 are designed to work efficiently with asynchronous communication, making it a natural fit for applications that leverage these technologies.\n\nImplementation in Python:\nasyncio Library: Python\u2019s asyncio library is a popular choice for writing asynchronous code. It provides a framework for writing single-threaded concurrent code using coroutines, multiplexing I\/O access, and running network clients and servers.\nIntegration with Frameworks: Many modern Python web frameworks (like FastAPI, Sanic, etc.) are built with native support for asynchronous operations, making it easier to integrate with asynchronous APIs like OpenAI's.\n\nIn summary, the shift towards asynchronous support with a client object in API design is a response to the growing need for more efficient, scalable, and responsive applications, particularly in the context of web and network services. 
It aligns well with modern software practices and technological advancements.\n<\/code><\/pre>\n<p><img decoding=\"async\" src=\"http:\/\/madapapa.com\/wordpress\/wp-content\/uploads\/2023\/12\/17030548288569.jpg\" alt=\"\" \/><\/p>\n<p><a href=\"https:\/\/github.com\/openai\/openai-python\/discussions\/742#async-client\">See this document<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>% pip3 install --upgrade openai openai.ChatCompletion.create() -&gt; client.chat.completions.create() response = openai.chat.completions.create( model=model, messages=messages, temperature=0, ) return response.choices[0].message.content I did not actually switch to a client instance, and the call still succeeded. But judging from Andrew Ng's walkthrough below, the response should also be read through attribute access rather than dictionary-style indexing. After reading ChatGPT's explanation, though, I still ended up calling the relevant functions through a client instance, for the reasons below (more flexible, better support for asynchronous operations, and more in line with industry conventions) The shift from using openai.chat to client.chat in the OpenAI API reflects a change in the design of the OpenAI Python library. 
Here are the key reasons for this change: &hellip; <a href=\"http:\/\/madapapa.com\/wordpress\/?p=679\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">OpenAI's SDK Upgraded Again<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[48,5],"tags":[],"class_list":["post-679","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence","category-linux"],"_links":{"self":[{"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/679","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=679"}],"version-history":[{"count":4,"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/679\/revisions"}],"predecessor-version":[{"id":684,"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/679\/revisions\/684"}],"wp:attachment":[{"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=679"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/madapapa.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=679"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/madapapa.com\/wordpress\/index.
php?rest_route=%2Fwp%2Fv2%2Ftags&post=679"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
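The post's quoted rationale stresses non-blocking calls and handling multiple requests concurrently. A minimal sketch of that pattern, using only the standard library's asyncio: the `ask` coroutine here is a hypothetical stand-in for a real network request; with openai>=1.0 you would instead await `client.chat.completions.create(...)` on an `AsyncOpenAI` client inside it.

```python
import asyncio

async def ask(prompt: str) -> str:
    # Stand-in for a real API call; sleeping yields control to the event
    # loop instead of blocking, just as an awaited network request would.
    await asyncio.sleep(0.01)
    return f"answer to: {prompt}"

async def main() -> list[str]:
    # gather() schedules all three "requests" concurrently, so total wall
    # time is roughly one request's latency rather than three in sequence.
    return await asyncio.gather(*(ask(p) for p in ["a", "b", "c"]))

results = asyncio.run(main())
print(results)
```

The same structure carries over to the real client: create one `AsyncOpenAI` instance, fire off several `create` coroutines, and `gather` the responses.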